Rate limit a DRF endpoint with Redis: a working example
I will try to be as practical as possible in this article. For more theoretical background, you can use this DRF throttling guide.
Let's assume you have a Django REST Framework project up and running and you want to rate limit an endpoint. For that, we will use DRF's throttling classes, and we will need Redis to store the rate limit data.
We will install Redis as the cache backend for Django and DRF. I will use Rocky Linux 9 for this example.
1 Install Redis service
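On Rocky Linux 9, Redis is available from the default repositories, so a plain package install should be enough (a minimal sketch, assuming the stock AppStream repository):

```shell
# install the Redis server package from the default Rocky Linux 9 repos
sudo dnf install -y redis
```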
2 Enable, Start and check the status of Redis
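Something like the following should work; `redis` is the stock unit name on Rocky Linux 9, and `enable --now` both enables the service at boot and starts it immediately:

```shell
# enable Redis at boot and start it right away, then verify it is running
sudo systemctl enable --now redis
systemctl status redis
```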
It should look like this:
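An abbreviated example of a healthy status output (paths, timestamps, and PIDs will differ on your machine):

```
● redis.service - Redis persistent key-value database
     Loaded: loaded (/usr/lib/systemd/system/redis.service; enabled; ...)
     Active: active (running) since ...
```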
Next, we need to install the Python client for Redis.
3 Install Redis python client
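Inside your project's virtual environment, a typical install looks like:

```shell
# install the official Python client for Redis
pip install redis
```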
4 Install Django Redis Cache Backend
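The django-redis package provides the Django cache backend we will configure next:

```shell
# install the Django cache backend for Redis
pip install django-redis
```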
5 Enable the django-redis cache backend
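In settings.py, point the default cache at your local Redis instance. A minimal sketch, assuming Redis listens on the default port 6379 and we use database 1:

```python
# settings.py -- assumes Redis on localhost:6379, using database 1
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
        "OPTIONS": {
            # the default client class shipped with django-redis
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        },
    }
}
```

DRF's built-in throttle classes store their request history in Django's default cache, which is why pointing it at Redis is all the wiring the throttles need.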
6 Enable DRF throttling
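Still in settings.py, turn on a global throttle. A minimal sketch; the "100/minute" rate is an assumption chosen to match the 101-request test later in this article, so adjust it to your needs:

```python
# settings.py -- throttle all anonymous clients;
# the "100/minute" rate is an assumption for this example
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.AnonRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "anon": "100/minute",
    },
}
```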
7 Identify the endpoint you want to rate limit
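As a sketch, here is what a throttled function-based view for the hi endpoint could look like. With the global DEFAULT_THROTTLE_CLASSES setting above, the decorator is optional; it is shown here to make the throttle explicit for this one view:

```python
# views.py -- a minimal sketch; the view name is an assumption
from rest_framework.decorators import api_view, throttle_classes
from rest_framework.response import Response
from rest_framework.throttling import AnonRateThrottle

@api_view(["GET"])
@throttle_classes([AnonRateThrottle])  # explicit per-view throttle
def hi(request):
    return Response({"msg": "Hello Hello!"})
```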
8 Test Functionality
In my case, the hi endpoint responds with a simple “msg”: “Hello Hello!” message. Let's use this simple for loop to send 101 requests to the endpoint and see when DRF notifies us about the limit.
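A sketch of such a loop; the /api/hi/ path and the development-server port 8000 are assumptions, so adjust them to your project's routing:

```shell
# print the HTTP status code of each of the 101 requests;
# the /api/hi/ path is an assumption for this example
for i in $(seq 1 101); do
  curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8000/api/hi/
done
```

With a 100/minute rate, the first 100 lines should read 200 and the last one 429.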
When we reach the limit, DRF responds with a 429 status code and a message like this:
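The body is DRF's standard throttled message; the wait time in seconds varies with how far into the rate window the request lands:

```json
{
  "detail": "Request was throttled. Expected available in 58 seconds."
}
```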
9 Conclusion
This is one of the simplest ways to rate limit an endpoint in DRF. Sometimes this is not enough, and you need more control over the rate limit.
There is a more advanced way to do this using Redis and the leaky bucket algorithm. I will write about it in the near future.
I hope this article was helpful to you.