
I'm moving towards HAProxy, using it to replace NGINX as well, and I have a question about how to do rate limiting in HAProxy in a way that queues excess requests instead of closing them.

I was able to limit per IP by following these examples: https://www.haproxy.com/blog/four-examples-of-haproxy-rate-limiting/ . However, when the limit is reached, users see an error and the connection is closed.
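For reference, what I have now is roughly along the lines of the deny-based example from that article; something like this (simplified, with placeholder names and numbers):

    frontend fe_web
        bind :80
        # Track each client IP and count its HTTP request rate over 1 minute.
        stick-table type ip size 100k expire 2m store http_req_rate(1m)
        http-request track-sc0 src
        # Hard limit: anything above 30 requests/minute is rejected with a 429.
        http-request deny deny_status 429 if { sc_http_req_rate(0) gt 30 }
        default_backend be_app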

Since I come from NGINX, I'm used to this handy feature https://www.nginx.com/blog/rate-limiting-nginx/ where requests that exceed the threshold can be rejected, but in general they are queued. The user can still make the calls, just delayed, without getting errors, and the overall number of requests stays within the threshold.
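Concretely, the NGINX behaviour I mean is limit_req with a burst queue, something like this (names are placeholders):

    # 30 requests/minute per client IP, keyed on the remote address.
    limit_req_zone $binary_remote_addr zone=per_ip:10m rate=30r/m;

    server {
        listen 80;

        location / {
            # Up to 30 excess requests are queued and released at the
            # configured rate instead of being rejected; only beyond the
            # burst size do clients get an error.
            limit_req zone=per_ip burst=30;
            proxy_pass http://app_backend;
        }
    }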

Is there anything similar in HAProxy? It should limit/queue requests per client IP.

To explain with an example: we have two users, Alice with IP A.A.A.A and Bob with IP B.B.B.B. The threshold is 30 requests/minute.

So in 1 minute:

  • Alice does 20 requests -> that's fine.
  • Bob does 60 requests -> the system caps him at 30 and processes the other 30 later on (maybe also adding a timeout/delay).
  • Alice does 50 requests -> the first 30 are fine, the next 20 are queued.
  • Bob does 20 requests -> they are queued after the ones above.

Is this possible?

EsseTi
  • What is the HAProxy configuration? Especially timeout queue and per-client tables. To maintain good response times, requests are typically queued only for "a few" seconds. – John Mahowald May 30 '20 at 03:31
  • Sorry, I did not get much out of this. Can you please provide some info on how to do that? In NGINX it's very easy to set up, while in HAProxy doing this per client seems very difficult. – EsseTi Jun 07 '20 at 14:32
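A minimal, untested sketch of the direction the first comment points at (per-client tables plus backend queuing), assuming HAProxy 1.9+ for set-priority-class; all names and numbers are placeholders:

    frontend fe_web
        bind :80
        stick-table type ip size 100k expire 2m store http_req_rate(1m)
        http-request track-sc0 src
        # Instead of denying clients above 30 req/min, push their requests
        # into a lower priority class so they are dequeued after everyone
        # else's (default class is 0; higher numbers are served later).
        http-request set-priority-class int(10) if { sc_http_req_rate(0) gt 30 }
        default_backend be_app

    backend be_app
        # How long a request may wait in the queue before HAProxy gives up
        # and returns a 503.
        timeout queue 30s
        # A deliberately low per-server maxconn forces excess requests to
        # wait in HAProxy's queue rather than being rejected immediately.
        server app1 127.0.0.1:8080 maxconn 50

Note that this is not a per-client cap like NGINX's burst queue: requests only wait once the server's maxconn is saturated, and the priority class only changes the order in which queued requests are released.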

0 Answers