
I have an Ubuntu 10.04 server where I installed mod_evasive using apt-get install libapache2-mod-evasive.

I have already tried several configurations, but the result stays the same.

The blocking does work, but randomly.

I tried low limits with long blocking periods as well as short ones.

The behaviour I expect is that I can make requests until either the page or the site limit is reached within the given interval. After that, I expect to stay blocked until I have made no further requests for the duration of the blocking period.

However, the actual behaviour is that I can make requests and after a while I get seemingly random 403 responses; their share increases and decreases, but they are very scattered.

This is the output of siege, to give you an idea:

HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.11 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt

The exact limits in place during this test run were:

DOSHashTableSize 3097
DOSPageCount 10
DOSSiteCount 100
DOSPageInterval 10
DOSSiteInterval 10
DOSBlockingPeriod 120
DOSLogDir /var/log/mod_evasive
DOSEmailNotify ***@gmail.com
DOSWhitelist 127.0.0.1
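
For completeness, this is roughly how the block looks when wrapped in IfModule (mod_evasive20.c is, as far as I know, the name the Apache 2.x build of mod_evasive registers; the file path is assumed from the Debian/Ubuntu packaging):

# /etc/apache2/mods-available/mod-evasive.conf (path assumed from the Ubuntu package)
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        10
    DOSSiteCount        100
    DOSPageInterval     10
    DOSSiteInterval     10
    DOSBlockingPeriod   120
    DOSLogDir           /var/log/mod_evasive
    DOSEmailNotify      ***@gmail.com
    DOSWhitelist        127.0.0.1
</IfModule>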

So I would expect to stay blocked for at least 120 seconds once I have been blocked.

Any ideas about this?

I also tried adding my configuration in different places (vhost, server config, directory context) and with or without an IfModule directive...

This doesn't change anything.

The Shurrican

1 Answer


This issue can be caused by running Apache in prefork mode: mod_evasive's counters are not shared between processes, so each child process keeps its own hit count and decides independently whether to block. Requests that land on a child whose counter is below the limit still get a 200, which produces exactly the scattered 403s you are seeing.

There is a post about it on Server Fault.
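
You can confirm which MPM your build uses (a quick check, assuming the stock Ubuntu apache2 packages):

# Show the compiled-in MPM; "Prefork" confirms the per-process counter problem
apache2ctl -V | grep -i mpm

# On Ubuntu 10.04 the worker MPM is a separate package that replaces prefork.
# Note: mod_php requires prefork, so this only helps if PHP runs via FastCGI.
sudo apt-get install apache2-mpm-worker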

profy
  • Wow, they didn't fix that? That's a hell of a bug – msEmmaMays Sep 01 '12 at 20:33
  • It's a design problem – profy Sep 01 '12 at 21:01
  • Indeed, I am running in prefork mode. Is there anything I can do about this? Any similar module? I am primarily having problems with users trying to brute-force my login page or requesting a lot of pages to overload the server in terms of CPU/database load, so this module seems more suitable to me than network-layer DDoS protection. – The Shurrican Sep 02 '12 at 07:15
  • Maybe mod_security has anti-DDoS features, but its setup is not so simple. – profy Sep 02 '12 at 09:13
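
For what it's worth, a minimal sketch of the kind of per-IP rate limiting mod_security can do (ModSecurity 2.x syntax; the threshold and window values are placeholders, not tested, and persistent collections require SecDataDir to be configured):

# Track each client IP in a persistent collection
SecAction "phase:1,id:1001,nolog,pass,initcol:ip=%{REMOTE_ADDR}"

# Count this request and let the counter decay after 10 seconds
SecAction "phase:1,id:1002,nolog,pass,setvar:ip.requests=+1,expirevar:ip.requests=10"

# Deny once an IP exceeds 10 requests within the window
SecRule IP:REQUESTS "@gt 10" "phase:1,id:1003,deny,status:403,log,msg:'Per-IP request limit exceeded'"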