
Occasionally I flip through our (Apache) access log, and I often come across people fishing for admin pages. For example, they try to access pages like:

/wp-login.php
/administrator/index.php
/admin.php
/user

None of these pages/directories actually exist; they either never have, or I've renamed them to something less obvious.

So, do people actively block these kinds of requests? I sometimes do, but I get a lot of them, and I'm wondering if it actually makes any difference. Currently I block the host or IP in my httpd.conf file.

Am I worrying over nothing?

(FYI: running Apache on a Linux-based server.)

EDIT: From the perspective of 'sub-enterprise' (small-medium business level).

AD7six
Hubbo
    You can run something like [fail2ban](http://www.fail2ban.org/wiki/index.php/Main_Page) to block based on log activity. But do realize most of those "people" you want to block aren't. Bots gonna bot. – jscott Mar 05 '15 at 01:13
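The log-based blocking jscott describes can be sketched in a few lines of Python; fail2ban automates essentially this scan (plus the actual firewall ban). The probe paths are taken from the question, but the log regex and the ban threshold below are illustrative assumptions, not fail2ban defaults.

```python
import re
from collections import Counter

# Paths from the question; in practice a filter would match patterns, not a fixed set.
PROBE_PATHS = {"/wp-login.php", "/administrator/index.php", "/admin.php", "/user"}

# Matches the client IP and request path of an Apache combined-format log line.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

def ips_to_ban(log_lines, threshold=3):
    """Return the set of client IPs that requested a probe path
    at least `threshold` times (threshold is an arbitrary choice)."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) in PROBE_PATHS:
            hits[m.group(1)] += 1
    return {ip for ip, n in hits.items() if n >= threshold}
```

A real deployment would then feed the resulting IPs to the firewall, which is exactly the step fail2ban handles for you.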

3 Answers

2

There are some advantages to blocking such requests at the firewall level, namely:

  • iptables takes fewer resources to block the connection than Apache does to return an error.
  • It will keep your log files smaller and cleaner.
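As a minimal sketch of a firewall-level block, here is a rule in iptables-restore format; the address 203.0.113.10 is a placeholder, not from the original post, and a real setup would let fail2ban insert and expire such rules automatically:

```
# Fragment for iptables-restore: drop all traffic from a probing
# host before the connection ever reaches Apache.
*filter
-A INPUT -s 203.0.113.10 -j DROP
COMMIT
```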

You mentioned blocking the hosts in httpd.conf. That is less useful: the bots are receiving a 404 anyway, and it neither benefits you nor harms them to send a different error code (which is all that blocking in httpd.conf will do).

The only benefit to blocking in httpd.conf is that a malicious host trying to hack a non-existent page now might try to hack a real page later. Of course, you get the same benefit by blocking at the firewall level, along with the other advantages mentioned above.

tl;dr: block the requests, but with fail2ban, not with httpd.conf.
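A hedged sketch of what that looks like in practice, assuming fail2ban's bundled apache-noscript filter and a stock Debian/Ubuntu Apache log layout (adjust logpath for your distribution; maxretry and bantime below are illustrative values):

```ini
# /etc/fail2ban/jail.local -- minimal sketch, not a complete config.
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/*error.log
maxretry = 3
bantime  = 3600
```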

Joe Sniderman
0

Some people block these requests. However, doing so is largely pointless, since the requests themselves are harmless. Crawlers scan the Internet all the time, whether for things to hack or things to index. These scans aren't worth the trouble of blocking, and blocking might actually end up preventing indexing or (rarely) catching a real user who shares an IP with such a bot.

If the log entries are bothering you, you should probably be using a log aggregator; watching raw logs is tedious, you'll miss things, and you'll fixate on irrelevant entries like these.

Falcon Momot
-1

At our organization, we use a web application firewall (the Application Security Manager [ASM] module from F5) to block these kinds of requests. It works by first learning a database of acceptable URLs. It is intelligent enough to figure out all of the static links as well as links which may be variable. For example, if a token or unique ID is included with a request, it will enforce limits on the length of that part of the URL based on what it has seen in the past. Furthermore, it has a regularly updated database of other things it watches out for, such as SQL or Bash in a URL or user-agent string.

While you may not be vulnerable right now to the types of attacks you are seeing, you cannot guarantee that you won't be in the future, or that you aren't vulnerable to something you simply haven't seen yet or that is still undiscovered. Furthermore, a solution like this can help guard against DDoS attacks. Being regularly targeted incurs some processing cost on your infrastructure that could be reduced. And if you know you are regularly targeted, then either (1) someone is trying very hard to get in, (2) you are an obvious target, or (3) you are a high-value target. Most commonly it is (2), and most commonly the traffic comes from a wide range of source IPs and is therefore not easily blocked.

As to whether it is worth the money to do so, that is really a business decision. You must weigh the cost of a potential breach against the cost of the solution or solutions, much like having an insurance policy. It may not cover all situations and it may never pay off to have it. Or it may be the thing that saves you.

James Shewey
  • A learning system sounds like a great solution. However, it appears to be more Enterprise-level - a quick Google search turns up prices of around $15k. I should have specified this is more for a small-to-medium-business-sized website. – Hubbo Mar 05 '15 at 23:23