I'm trying to implement a throttling feature in nginx that is shared across multiple servers in multiple datacenters, and I would like to know the best practice for building this.
For example, let's say I have an HTTP API running on two clusters of servers (each behind a load balancer) located in two different datacenters. I would like to throttle a developer by their API key to 1000 requests/hour. The developer has built a mobile application, which means that, depending on where their end users are, requests will be served by both locations (whichever datacenter is closest).
How would you enforce throttling in this particular scenario?
The easiest way would be to implement throttling in each of the N datacenters separately, giving each one an allowance of M/N. In your case M = 1000 requests/hour and N = 2 datacenters, so just use M/N = 500 requests/hour as the throttle value in each datacenter.
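As a rough per-datacenter sketch, you could key nginx's built-in limit_req module on the API key header. The header name (X-Api-Key), zone name, and upstream addresses below are assumptions to adapt to your setup; also note that nginx expresses rates in r/s or r/m, so 500 requests/hour is approximated here as 8 requests/minute, and limit_req is a leaky-bucket limiter rather than a strict rolling hourly quota.

```nginx
http {
    # Shared-memory zone keyed on the client's X-Api-Key header (assumed header name).
    # ~500 requests/hour per datacenter, approximated as 8 requests/minute.
    limit_req_zone $http_x_api_key zone=per_apikey:10m rate=8r/m;

    upstream api_backend {
        # Assumed backend addresses for this datacenter.
        server 10.0.0.10:8080;
        server 10.0.0.11:8080;
    }

    server {
        listen 80;

        location /api/ {
            # Allow short bursts without delaying them; reject excess with 429.
            limit_req zone=per_apikey burst=20 nodelay;
            limit_req_status 429;

            proxy_pass http://api_backend;
        }
    }
}
```

The same configuration would go into both datacenters, each enforcing its half of the overall budget independently.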
See: NGINX - throttle requests to prevent abuse