
I've got a scripter who is using a proxy to attack a website I'm serving.

I've noticed that they tend to access the site via software with a certain common user agent string (i.e. http://www.itsecteam.com/en/projects/project1_page2.htm "Havij advanced sql injection software", with a user agent string of Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) Havij). I'm aware that any cracking software worth its salt will probably be able to modify its user agent string, but I'm fine with the scripter having to deal with that feature at some point.

So, is there any software out there for automatically blocking access & permanently blacklisting by matching user agent strings?

Kzqai

2 Answers


You can deny access by matching the User-Agent header with SetEnvIf (or BrowserMatch) and then using Deny. Example:

SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
<Directory "/var/www">
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot
</Directory>
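Applied to the Havij user agent from the question, a rule along these lines should work (the substring match is an assumption on my part: SetEnvIfNoCase takes a regular expression, so matching just the distinctive "Havij" token is simpler than anchoring the full string):

```apache
# Assumed: "Havij" appears somewhere in the offending User-Agent string
SetEnvIfNoCase User-Agent "Havij" bad_bot
```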

To permanently block them, write a custom log file and then use fail2ban, for example, to ban the offending IPs with iptables.

For example, create a LogFormat:

LogFormat "%a %{User-agent}i" ipagent

Add the logging to your vhost or server-wide:

CustomLog /var/log/apache2/useragent.log ipagent

/etc/fail2ban/filter.d/baduseragent.conf

[Definition]
failregex = ^<HOST> Mozilla/4\.0 \(compatible; MSIE 7\.0; Windows NT 5\.1; SV1; \.NET CLR 2\.0\.50727\) Havij$
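A quick sanity check of that regex (a sketch: fail2ban substitutes `<HOST>` with its own IP-matching group, which I approximate here with an explicit IP pattern; the sample address 203.0.113.7 is made up for illustration):

```shell
# Sample line in the "ipagent" log format defined above
line='203.0.113.7 Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) Havij'
# grep -c prints the number of matching lines
echo "$line" | grep -cE '^[0-9]{1,3}(\.[0-9]{1,3}){3} Mozilla/4\.0 \(compatible; MSIE 7\.0; Windows NT 5\.1; SV1; \.NET CLR 2\.0\.50727\) Havij$'
# prints 1
```

On a real system you can test the filter against the actual log with fail2ban's own tool: `fail2ban-regex /var/log/apache2/useragent.log /etc/fail2ban/filter.d/baduseragent.conf`.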

/etc/fail2ban/jail.conf

[apache-bad-user-agent]

enabled  = true
port     = 80,443
protocol = tcp
filter   = baduseragent
maxretry = 1
bantime  = 86400
logpath  = /var/log/apache2/useragent.log
Dmytro Leonenko
  • Yeah, that would block bad user agent strings, but I'm looking to take it a step further and auto-ban IPs associated with the bad user agent strings, so that use of the user agent string gets that IP banned from then on. I'd use fail2ban, but I haven't found a way to nicely apply it to HTTP requests at this point. – Kzqai Mar 25 '11 at 21:50
  • Try my suggestion in edited post – Dmytro Leonenko Mar 25 '11 at 22:39

I think I understand your question. I will provide a more detailed explanation if this is what you are looking for. (This will also work as a trap for other things.)

  • Enable the mod_rewrite engine in apache2
  • Create a trap.php; visiting it can do whatever you like. For example, I made it add every visitor's IP to a blacklist that denies access to my site.
  • Create a file of the user agents you don't like, one per line, like this:
    bad_useragent [tab] black
    useragent_bad [tab] black
  • Now add mod_rewrite directives that look the user agent up in the map of bad user agents and rewrite to your trap when there is a match. The rules may look like this:

    # RewriteMap needs server/vhost context and an absolute path ("~" is not expanded)
    RewriteMap badlist txt:/etc/apache2/bad_useragent_list
    RewriteCond %{HTTP_USER_AGENT} (.*) [NC]
    RewriteCond ${badlist:%1|white} ^black$ [NC]
    RewriteRule (.*) /trap.php [L]

  • This basically matches the user agent against the keys in your file; if it isn't found, it is assumed to be "white" and the request passes through unmodified. If it is found and the associated value is "black", the request is rewritten to your trap.php, which does whatever you like.
  • Some possible ideas: have another script watch a common file that trap.php writes IPs to. When that file changes, the watcher reads the new entries, parses out the IP addresses, and adds a rule to iptables blocking all traffic from each address. I hope this helps! Again, if you would like more detail, just reply here.
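The watcher idea in the last bullet can be sketched like this. Everything here is hypothetical: the sample IPs are made up, trap.php's output file is stood in by a temp file, and the iptables commands are only printed rather than executed, so the sketch runs without root:

```shell
#!/bin/sh
# Sketch of the watcher: trap.php is assumed to append one IP per line to a
# shared file; we parse that file and emit the iptables command for each address.
banlist=$(mktemp)                                # stand-in for trap.php's output file
printf '203.0.113.7\n198.51.100.23\n' > "$banlist"
# Keep only well-formed IPv4 lines, deduplicate, and print the ban command
grep -Eo '^[0-9]{1,3}(\.[0-9]{1,3}){3}$' "$banlist" | sort -u | while read -r ip; do
  # A real watcher would execute this; here we only print it
  echo "iptables -I INPUT -s $ip -j DROP"
done
rm -f "$banlist"
```

A real implementation could poll in a loop with `sleep`, or use `inotifywait` from inotify-tools to react as soon as trap.php appends to the file.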
blerbl
  • Almost forgot, you can read about mod_rewrite here: http://httpd.apache.org/docs/current/mod/mod_rewrite.html – blerbl Oct 28 '12 at 21:00