
mod_security rule 960015 keeps catching Google and other good bots. I have the following in the vhost to prevent good bots from being caught:

SecRule REQUEST_HEADERS:User-Agent "Mail.ru" log,allow
SecRule HTTP_USER_AGENT "Mail.RU_Bot" log,allow

Same for Google and Yandex.
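The Google and Yandex entries follow the same pattern; presumably something like this (the exact match strings here are only illustrative, not the original config):

SecRule REQUEST_HEADERS:User-Agent "Googlebot" log,allow
SecRule REQUEST_HEADERS:User-Agent "YandexBot" log,allow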

It works 99% of the time, but fails at other times for some really bizarre reason. Here are example log entries for the Mail.ru bot:

Successful:

217.69.134.79 - - [07/Mar/2014:10:17:13 +0400] "GET / HTTP/1.1" 200 189934 "-"
"Mozilla/5.0 (compatible; Linux x86_64; Mail.RU_Bot/Fast/2.0; 
+http://go.mail.ru/help/robots)"

[Fri Mar 07 10:17:13 2014] [error] [client 217.69.134.79] ModSecurity: Access 
allowed (phase 2). Pattern match "Mail" at REQUEST_HEADERS:User-Agent. 
[file "/etc/apache2/sites-enabled/xxx"] [line "28"] [hostname "xxx"] 
[uri "/"] [unique_id "UxlkaQp-d4EAABU9BSIAAAAV"]

And the next minute it fails:

217.69.134.79 - - [08/Mar/2014:02:14:19 +0400] "GET / HTTP/1.1" 403 389 "-" "
Mozilla/5.0 (compatible; Linux x86_64; Mail.RU_Bot/2.0; +http://go.mail.ru/
help/robots)"

[Sat Mar 08 02:14:19 2014] [error] [client 217.69.134.79] ModSecurity: Access 
denied with code 403 (phase 2). Operator EQ matched 0 at REQUEST_HEADERS. 
[file "/usr/share/modsecurity-crs/activated_rules/
modsecurity_crs_21_protocol_anomalies.conf"] [line "47"] [id "960015"] 
[rev "2.2.5"] [msg "Request Missing an Accept Header"] [severity "CRITICAL"] 
[tag "PROTOCOL_VIOLATION/MISSING_HEADER_ACCEPT"] [tag "WASCTC/WASC-21"] 
[tag "OWASP_TOP_10/A7"] [tag "PCI/6.5.10"] [hostname "xxx"] [uri "/"] 
[unique_id "UxpEuwp-d4EAAEMnBFQAAAAE"]

I know the proper way is to do reverse lookups, but they slow down the website, and I want to have at least some security; as it stands I can't use 960015 because it blocks Google and others. At the same time it is a very useful rule that has caught hundreds of bad bots.

If someone knows how to set this up with reverse lookups in a way that actually works and still allows Google and the other good bots to index - you are welcome to post it here. However, I am also looking for a quick and dirty solution to make it work right now, since some security is better than no security.
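For what it is worth, one possible shape for the reverse-lookup variant - if you can live with the DNS cost - is to let Apache resolve the client hostname and key the whitelist off REMOTE_HOST in addition to the User-Agent. A rough, untested sketch; the hostname patterns are only examples:

# Requires "HostnameLookups Double" (forward-confirmed reverse DNS) in the Apache
# config; otherwise REMOTE_HOST only contains the client IP address.
SecRule REQUEST_HEADERS:User-Agent "Google|Mail\.RU_Bot|Yandex" \
    "chain,phase:1,t:none,pass,nolog"
SecRule REMOTE_HOST "\.(googlebot\.com|google\.com|mail\.ru|yandex\.(ru|net|com))$" \
    "t:none,ctl:ruleRemoveById=960015"

Only when both the claimed User-Agent and the resolved hostname look like a known good bot does the chain disable rule 960015 for that request; a spoofed User-Agent from an unrelated IP would not pass the second condition.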

Vlad
  • After further investigation I can tell that the only line that is relevant is `SecRule REQUEST_HEADERS:User-Agent "Mail.ru" log,allow`; however, the problem is that the header is empty when it fails, so I sort of need to add a second condition so that the rule says something like `if headers empty and body has Google nolog,allow` (see the sketch after these comments). I will research whether this is possible; if not, I will modify the actual rule code. – Vlad Mar 08 '14 at 04:45
  • Here is the rule in question that needs editing to allow Google bot: `SecRule REQUEST_METHOD "!^OPTIONS$" \ "skipAfter:END_ACCEPT_CHECK,chain,phase:2,rev:'2.2.5',t:none,block,msg:'Request Missing an Accept Header', severity:'2',id:'960015',tag:'PROTOCOL_VIOLATION/$ SecRule &REQUEST_HEADERS:Accept "@eq 0" "t:none,setvar:'tx.msg=%{rule.msg}',setvar:tx.anomaly_score=+%{tx.notice_anomaly_score},setvar:tx.protocol_violation$ ` – Vlad Mar 08 '14 at 05:18
  • I have the same problem, but about the rule that fixed your issue: `SecRule REQUEST_HEADERS:User-Agent "Google|Mail|Yandex" "phase:1,t:none,allow,nolog,ctl:ruleRemoveById=960015"` - isn't it a flaw by itself? How can you be sure that attackers will not abuse this by sending a fake User-Agent string, especially when you have announced it publicly? However, I suggest you omit the action "allow"; this might be a remedy, because by using the action "allow" you are opening everything up to attackers. The action "ctl:ruleRemoveById=960015" only lets the request skip rule 960015, but "allow" is too broad. – Ehsan Mahdavi Jan 19 '15 at 09:03
  • Replace "allow" with "pass" in my rule, will monitor the situation. – Vlad Jan 27 '15 at 23:49
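One reading of the two-condition idea from the first comment (treating "header empty" as the missing Accept header that 960015 checks for) would be a chained rule; a rough, untested sketch, not the rule that was eventually used:

SecRule &REQUEST_HEADERS:Accept "@eq 0" "chain,phase:1,t:none,pass,nolog"
SecRule REQUEST_HEADERS:User-Agent "Google|Mail\.RU_Bot|Yandex" "t:none,ctl:ruleRemoveById=960015"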

2 Answers


First a disclaimer: I'm the author of Bad Behavior, a similar product, and some of the ModSecurity core rules were derived from Bad Behavior.

RFC 2616 states that the Accept header SHOULD be present in all requests. Note that this isn't an absolute requirement, so a user-agent is still conditionally compliant (as defined in the RFC) if it doesn't send this header.

The rationale for denying requests without an Accept header is that all regular web browsers do send the header, while many bots do not. In practice, though, after seeing millions of requests, some "good" bots don't send the Accept header either. So this rule is not perfect and does generate false positives.

Bad Behavior does not block these unless the request is a POST request. This cuts down on spam and reduces false positives to approximately zero, but still passes other bots. In my experience, many of those get caught by other rules anyway.

In your situation I would just disable this rule. It isn't buying you quite as much as you seem to think. If you want, you can modify it so that it only applies to POST requests.
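One way to do the latter without editing the CRS file itself would be a phase 1 rule that disables 960015 for everything except POST, reusing the ctl mechanism from the other answer; an untested sketch:

# Skip the Accept-header check for anything that is not a POST request.
SecRule REQUEST_METHOD "!^POST$" "phase:1,t:none,pass,nolog,ctl:ruleRemoveById=960015"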

Michael Hampton
  • This rule bans 100 bad scanner/hacker tool IPs per day on average, and my site isn't very popular yet; I imagine this number will be in the thousands. So yes, it does buy me lots of brownie points, and I definitely do want to keep it running, combined with Fail2ban. – Vlad Mar 08 '14 at 00:28

Here is a modified rule that fits the purpose; it has been running for 48 hours now, Google and the others work fine while the baddies still get caught, woohoo!

Add this to the vhost in question:

SecRule REQUEST_HEADERS:User-Agent "Google|Mail|Yandex" "phase:1,t:none,pass,nolog,ctl:ruleRemoveById=960015"

2015 update with a more recent situation - scammers have wised up and now mostly send fake headers pretending to be Google, so different security strategies are required.

Vlad
  • Where do I add this? I'm new to this. I've just installed and enabled OWASP on my server and I'm seeing these 960015 rule IDs, but I'm not following where to make this adjustment. – Drew Angell Jun 09 '15 at 12:53
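For anyone wondering where this goes: the rule above can live inside the <VirtualHost> block of the affected site (or in any ModSecurity configuration file loaded for that vhost). A minimal sketch with placeholder server name and paths:

<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example

    # Phase 1 whitelist; runs before the CRS phase 2 checks such as 960015.
    SecRule REQUEST_HEADERS:User-Agent "Google|Mail|Yandex" \
        "phase:1,t:none,pass,nolog,ctl:ruleRemoveById=960015"
</VirtualHost>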