
I want to protect a couple of Apache web servers against forceful browsing. Lately there have been multiple scans querying them for non-existent applications such as phpMyAdmin, which is notorious for its weak security record.

This is starting to degrade the performance of those servers, and I'm looking for ways to protect against it. I've found a couple of threads here on the topic.

From those, I gather my best bet is to throttle these requests.

I was wondering whether there is a system for this, analogous to the DNSBLs used in spam fighting. Maybe a global database of "attack queries" that could be used in conjunction with netfilter's conntrack or state module?

tomeduarte
    You've mixed multiple questions here into the same place. (1) How do I ensure that my web application is free of forceful browsing vulnerabilities? (2) Bad guys scanning my server for vulnerabilities are hurting its performance; how do I improve performance? Those two questions have two very different answers. You should post them separately, for best answers. – D.W. Aug 02 '11 at 19:42
  • I agree with @D.W. - you will be likely to get better responses if you split out the two questions as they are very distinct from each other. – Rory Alsop Aug 04 '11 at 08:17

5 Answers


Select some popular non-existing folders on your server and install tarpits. Tell your friends to do the same.

Edit: it helps by slowing down the bots that scan you, so you'll keep plenty of idle capacity regardless of how many are scanning you. As for what a tarpit is and how to set one up: a search for the term turns up regularly updated lists of links.
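To make the idea concrete, here is a minimal tarpit sketch in Python (my own illustration, not from the answer). It listens on a port, and when a request hits one of the bait paths it advertises a large body and then drips it out one byte every few seconds, tying up the scanner's connection; all path names and timings are assumptions you would tune.

```python
# Minimal HTTP tarpit sketch. Bait paths and timings are assumptions.
import socket
import threading
import time

TRAP_PATHS = (b"/phpMyAdmin", b"/phpmyadmin", b"/pma")  # hypothetical bait paths

def handle(conn: socket.socket) -> None:
    try:
        request = conn.recv(1024)
        if any(path in request for path in TRAP_PATHS):
            # Promise a big body, then deliver it one byte at a time, slowly.
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 100000\r\n\r\n")
            for _ in range(100000):
                conn.sendall(b"x")
                time.sleep(5)  # most scanners will sit here waiting
        else:
            conn.sendall(b"HTTP/1.1 404 Not Found\r\nContent-Length: 0\r\n\r\n")
    except OSError:
        pass  # client gave up and disconnected
    finally:
        conn.close()

def serve(port: int = 8080) -> None:
    with socket.socket() as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", port))
        server.listen()
        while True:
            conn, _addr = server.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

In practice you would run something like this behind a few fake vhost paths rather than as your real server; dedicated tools (e.g. LaBrea-style tarpits) do the same thing more robustly.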

pepe

You're thinking about this the wrong way. Forceful browsing is a vulnerability in the web application, and preventing it requires the application to be coded properly. If the application has a forceful-browsing vulnerability, you're unlikely to be able to fix it with Apache configuration.

Basically, forceful browsing is not related to faulty Apache configuration; it is related to faulty software development. The folks who are responsible for avoiding this problem are the software developers, not the network ops folks.

P.S. A DNSBL or similar list is going to be useless for defending against forceful browsing vulnerabilities.

D.W.
    The problem noted in the question is the performance hit from "forceful browsing". The looked-for solution is a way to block "forceful browsing" not because of a security concern, but because of a performance concern. – yfeldblum Aug 02 '11 at 19:55
    @Justice, As I commented elsewhere, the question is vague and ambiguous, and conflates multiple questions. I have a separate response that addresses the performance hit; this response addresses the security concern. See the other one for how to address the performance concerns. – D.W. Aug 02 '11 at 22:04

It looks like you want a Web Application Firewall (WAF). From OWASP: "A web application firewall (WAF) is an appliance, server plugin, or filter that applies a set of rules to an HTTP conversation." For example, it could have a rule that throttles requests from an IP after 3 successive 404 errors in a minute. The OWASP link also gives selection criteria for choosing a WAF. There are free and commercial software products, as well as network hardware, that fulfill the WAF role.

A cursory search reveals:

IronBee

ModSecurity

dotDefender

openWAF

BinarySec
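As a sketch of the 404-throttling rule described above, here is roughly what it looks like in ModSecurity 2.x (my illustration, not from the answer; the threshold, window, and status code are assumptions, you also need `SecDataDir` set for the per-IP collection to persist, and newer 2.7+ versions additionally require an `id:` action on each rule):

```apache
# Open a per-client-IP state collection
SecAction "phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR}"

# Count each 404 the client triggers; forget the counter after 60 seconds
SecRule RESPONSE_STATUS "@streq 404" \
    "phase:5,nolog,pass,setvar:ip.notfound=+1,expirevar:ip.notfound=60"

# Deny clients that racked up more than 3 recent 404s
SecRule IP:NOTFOUND "@gt 3" \
    "phase:1,deny,status:403,log,msg:'Throttled: repeated 404s'"
```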

this.josh

To answer your question about improving performance, there's a very simple answer: just blacklist the most common queries you are getting that are hurting your performance.

Grab logfiles for a month or so. Collate the most common attack requests. Something like the following will probably work: grep for requests that return 404 Not Found, find all unique URLs, count how many times each unique URL was accessed, and sort. Take all such commonly-accessed non-existent URLs, and put them in a blacklist. Now configure Apache to immediately block all attempts to access any URL on that blacklist.
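The collation step above can be done with a short pipeline. This sketch assumes the Apache common/combined log format, where field 7 is the request URL and field 9 the status code; the sample log lines are made up for illustration:

```shell
# Fabricate a tiny sample log (in real use, point at your actual access.log)
cat > access.log <<'EOF'
1.2.3.4 - - [02/Aug/2011:10:00:00 +0000] "GET /phpMyAdmin/index.php HTTP/1.1" 404 209
1.2.3.4 - - [02/Aug/2011:10:00:01 +0000] "GET /phpMyAdmin/index.php HTTP/1.1" 404 209
5.6.7.8 - - [02/Aug/2011:10:00:02 +0000] "GET /index.html HTTP/1.1" 200 1043
9.9.9.9 - - [02/Aug/2011:10:00:03 +0000] "GET /pma/index.php HTTP/1.1" 404 209
EOF

# 404s only -> URL -> count unique URLs -> most frequent first
awk '$9 == 404 {print $7}' access.log | sort | uniq -c | sort -rn > blacklist_candidates.txt
cat blacklist_candidates.txt
```

The top entries of `blacklist_candidates.txt` are your blacklist candidates; review them by hand before blocking anything.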

A slightly more sophisticated version of this is to collate the most commonly attacked applications (like phpMyAdmin), according to your logs. For each, check whether you have any copy of that application legitimately deployed on your site. If you do not, then add it to a blacklist. Now configure Apache to immediately block all attempts to access any application on that blacklist. This is a bit broader, but it may take a little more work to collate the blacklist and ensure you're not inadvertently blocking a legitimate application on your site.
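The blocking step might look like the following Apache snippet (my illustration; the path list is a placeholder you would replace with your own collated blacklist). This uses Apache 2.2 access-control syntax, which was current at the time; on 2.4 you would use `Require all denied` instead:

```apache
# Answer blacklisted probe paths with a cheap 403 before any app code runs
<LocationMatch "^/(phpMyAdmin|phpmyadmin|pma|mysqladmin)">
    Order deny,allow
    Deny from all
</LocationMatch>
```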

For this, I don't think you need a centralized blacklist. Your logs should have all the information needed to identify the most commonly accessed non-existing URLs/applications.

Do note that this approach might improve performance but it is unlikely to appreciably improve security.

D.W.

Just blackhole them. Anybody checking /phpMyAdmin on your server isn't playing nicely, and isn't there to actually use it. If somebody hits that, or any other common scanning location, just drop them for the next hour or day. This does carry some DoS potential, but exploiting it would take serious work, and a ban is easily reversible.
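One common way to implement this drop-on-probe behaviour (my suggestion, not named in the answer) is fail2ban: a log filter matches probes for bait paths, and the jail bans the source IP at the firewall for a fixed period. Filter name, paths, log location, and ban time below are all assumptions:

```ini
; /etc/fail2ban/filter.d/apache-probe.conf  (hypothetical filter)
[Definition]
failregex = ^<HOST> .* "(GET|POST) /(phpMyAdmin|phpmyadmin|pma|mysqladmin)[^"]*" 404

; Appended to /etc/fail2ban/jail.local
[apache-probe]
enabled  = true
filter   = apache-probe
logpath  = /var/log/apache2/access.log
maxretry = 1
bantime  = 3600
```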

I find great value in using failed attempts to kill possible probing.

Jeff Ferland