Mastering DRF Throttling: Why It Matters and How to Configure, Apply, and Customize
Using nginx's limit_req to cut off traffic at the edge is undeniably effective. But when you need different limits per view or action (say, 5 logins per minute, 20 uploads per day, or 1,000 reads per user), web-server configuration alone becomes unwieldy. DRF's throttling lets you enforce endpoint-specific limits right in your application code, which makes it an essential skill.
Why Throttling Is Needed
DRF throttling decides whether to allow a request, much like a permission, but it’s temporary rather than permanent. The documentation describes it as a way to control the rate at which a client can hit the API.
Typical use cases include:
- Mitigating abuse or attacks: brute‑force logins, spam, crawling, simple DoS
- Protecting costly resources: uploads, external API calls, heavy queries, generative‑AI endpoints
- Ensuring fair use: preventing a single user or key from monopolizing resources
- Codifying policies: “This API allows N requests per minute” managed in code rather than infrastructure
Throttling can be applied globally or to specific views/actions—something nginx alone can’t handle.
How DRF Throttling Works (Key Concepts)
1) Rate Strings
DRF uses strings like "100/day" or "60/min" to define limits.
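As a sketch of how these strings are interpreted (mirroring the logic of SimpleRateThrottle.parse_rate, where only the first letter of the period matters, so "100/day" and "100/d" are equivalent):

```python
def parse_rate(rate):
    """Sketch of how DRF's SimpleRateThrottle.parse_rate reads a rate string.

    Returns (number of requests, window duration in seconds). Only the first
    letter of the period is inspected: s, m, h, or d.
    """
    if rate is None:
        return None, None
    num, period = rate.split("/")
    duration = {"s": 1, "m": 60, "h": 3600, "d": 86400}[period[0]]
    return int(num), duration

print(parse_rate("60/min"))   # (60, 60)
print(parse_rate("100/day"))  # (100, 86400)
```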
2) Who Is Limited? (Client Identification)
- UserRateThrottle: Authenticated users are limited by user ID; unauthenticated requests are limited by IP.
- AnonRateThrottle: Only unauthenticated requests are limited by IP.
- ScopedRateThrottle: Applies a policy based on a named scope (e.g., uploads).
DRF derives the client IP from the X-Forwarded-For header, falling back to REMOTE_ADDR; when running behind a proxy, setting NUM_PROXIES correctly is crucial.
3) State Storage
The default implementation stores counts in the Django cache backend. With a single process/server, LocMemCache suffices, but in a multi‑worker or multi‑replica setup, a shared cache like Redis is essential.
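Under the hood, the counting is a sliding-window check over a list of request timestamps held in the cache. A minimal in-memory sketch of that algorithm (the class name and dict store here are illustrative stand-ins, not DRF's API):

```python
import time

class SlidingWindowThrottle:
    """Simplified sketch of SimpleRateThrottle's sliding-window algorithm,
    using a plain dict instead of the Django cache backend."""

    def __init__(self, num_requests, duration):
        self.num_requests = num_requests  # allowed requests per window
        self.duration = duration          # window length in seconds
        self.store = {}                   # key -> list of timestamps, newest first

    def allow_request(self, key, now=None):
        now = time.time() if now is None else now
        history = self.store.get(key, [])
        # Drop timestamps that have fallen out of the window.
        while history and history[-1] <= now - self.duration:
            history.pop()
        if len(history) >= self.num_requests:
            return False  # over the limit: DRF would answer with 429 here
        history.insert(0, now)
        self.store[key] = history
        return True

throttle = SlidingWindowThrottle(num_requests=3, duration=60)
results = [throttle.allow_request("user:42", now=t) for t in (0, 1, 2, 3)]
print(results)  # [True, True, True, False]
```

This also makes the multi-worker problem concrete: if each worker keeps its own `store`, each one allows the full quota, so the effective limit is multiplied by the worker count.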
Global Settings: The Quickest Start
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.AnonRateThrottle",
        "rest_framework.throttling.UserRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "anon": "100/day",
        "user": "1000/day",
    },
}
This applies a baseline policy to all endpoints. When a request exceeds its limit, DRF returns HTTP 429 (Too Many Requests) with a detail message indicating how long to wait.
Applying Throttles to Specific Views
1) Class‑Based Views (APIView)
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.throttling import UserRateThrottle

class ExpensiveView(APIView):
    throttle_classes = [UserRateThrottle]

    def get(self, request):
        return Response({"ok": True})
2) Function‑Based Views (@api_view)
from rest_framework.decorators import api_view, throttle_classes
from rest_framework.throttling import UserRateThrottle
from rest_framework.response import Response

@api_view(["GET"])
@throttle_classes([UserRateThrottle])
def ping(request):
    return Response({"pong": True})
3) ViewSet Actions (@action)
from rest_framework.decorators import action
from rest_framework.throttling import UserRateThrottle
from rest_framework.viewsets import ViewSet
from rest_framework.response import Response

class ItemViewSet(ViewSet):
    @action(detail=True, methods=["post"], throttle_classes=[UserRateThrottle])
    def purchase(self, request, pk=None):
        return Response({"purchased": pk})
Action‑level throttles override the ViewSet‑level configuration.
ScopedRateThrottle: Fine‑Grained Policies (Highly Recommended)
Scoped throttling lets you separate policies by meaningful names, keeping operations tidy.
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.ScopedRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "login": "5/min",
        "uploads": "20/day",
        "search": "60/min",
    },
}
Declare the scope in the view:
from rest_framework.views import APIView
from rest_framework.response import Response

class LoginView(APIView):
    throttle_scope = "login"

    def post(self, request):
        return Response({"ok": True})
DRF builds a unique key from the scope plus user ID or IP.
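Concretely, the key format is 'throttle_%(scope)s_%(ident)s' (SimpleRateThrottle.cache_format); a quick sketch of the construction, with illustrative identity values:

```python
# DRF's SimpleRateThrottle.cache_format template.
CACHE_FORMAT = "throttle_%(scope)s_%(ident)s"

def scoped_cache_key(scope, ident):
    """Build the cache key ScopedRateThrottle uses: the scope name plus
    the user's pk (authenticated) or IP address (anonymous)."""
    return CACHE_FORMAT % {"scope": scope, "ident": ident}

print(scoped_cache_key("login", "203.0.113.7"))  # throttle_login_203.0.113.7
print(scoped_cache_key("uploads", "42"))         # throttle_uploads_42
```

Because the scope is part of the key, a user exhausting the "uploads" budget is unaffected in "search", which is exactly the separation that keeps per-endpoint policies independent.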
Custom Throttles: Defining Your Own Key
While built‑in throttles cover many scenarios, real‑world needs often require custom logic.
1) Common Pattern: Inherit from SimpleRateThrottle
from rest_framework.throttling import SimpleRateThrottle

class LoginBurstThrottle(SimpleRateThrottle):
    scope = "login"

    def get_cache_key(self, request, view):
        username = (request.data.get("username") or "").lower().strip()
        if not username:
            return None  # returning None skips throttling for this request
        ident = self.get_ident(request)  # IP-based identity
        # Limit per (IP, username) pair rather than per IP alone.
        return f"throttle_login:{ident}:{username}"
Register the scope rate:
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "path.to.LoginBurstThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "login": "5/min",
    },
}
2) Using a Different Cache
from django.core.cache import caches
from rest_framework.throttling import AnonRateThrottle

class CustomCacheAnonThrottle(AnonRateThrottle):
    cache = caches["alternate"]
Deployment Checklist
1) Proxy‑Aware IP Detection
Ensure NUM_PROXIES matches the number of proxies so each user is identified correctly.
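For example, with a single trusted nginx instance in front of the application (a sketch; adjust the count to your own topology):

```python
REST_FRAMEWORK = {
    # One trusted proxy in front of the app: DRF counts back one hop in
    # X-Forwarded-For to find the real client IP. With NUM_PROXIES unset,
    # the left-most (spoofable) entry is used; with 0, REMOTE_ADDR is used.
    "NUM_PROXIES": 1,
}
```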
2) Avoid LocMemCache in Multi‑Worker Environments
Shared caches like Redis prevent per‑worker count drift.
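A typical settings fragment, assuming Django 4.0+ (which ships a built-in RedisCache backend) and a local Redis instance; older projects would use the django-redis package instead:

```python
# settings.py: point the default cache (which throttles use) at Redis.
# The URL below is an assumption; substitute your own Redis location.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
    }
}
```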
3) Race Conditions
Built‑in throttles may allow a few extra requests under high concurrency. For critical limits (e.g., payments), consider atomic counters such as Redis INCR + EXPIRE.
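A minimal in-memory sketch of the fixed-window INCR + EXPIRE pattern (with real Redis, INCR is atomic across processes, which closes the read-modify-write race; this stand-in only illustrates the window logic):

```python
import time

class FixedWindowCounter:
    """In-memory stand-in for the Redis INCR + EXPIRE rate-limit pattern.
    Names and structure are illustrative, not a production implementation."""

    def __init__(self, limit, window):
        self.limit = limit    # allowed hits per window
        self.window = window  # window length in seconds
        self.counts = {}      # key -> (window_start, count)

    def hit(self, key, now=None):
        now = time.time() if now is None else now
        start, count = self.counts.get(key, (now, 0))
        if now - start >= self.window:  # EXPIRE fired: start a fresh window
            start, count = now, 0
        count += 1                      # INCR
        self.counts[key] = (start, count)
        return count <= self.limit

limiter = FixedWindowCounter(limit=2, window=60)
print([limiter.hit("pay:42", now=t) for t in (0, 1, 2)])  # [True, True, False]
```

With redis-py, the same shape is an INCR followed by an EXPIRE set only on the first hit; because INCR is a single atomic server-side operation, concurrent workers can never double-spend the quota the way cache get/set throttles can.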
4) Client Experience: 429 & Retry‑After
DRF returns 429 by default. The built-in throttles already implement wait(), and DRF uses its return value to set the Retry-After header on the 429 response; override wait() in a custom throttle if your clients need different back-off guidance.
Takeaway: Combine nginx and DRF Throttling

- nginx: Front‑line shield against bulk traffic and attacks.
- DRF throttling: Fine‑tuned, endpoint‑aware policies that stay in code even when infrastructure changes.
For per‑view, per‑action limits, DRF throttling is the most flexible and maintainable approach.