25

I occasionally get clients asking me to look at their access_log files to determine whether any web attacks were successful. What tools are helpful for discerning attacks?

Tate Hansen
  • 13,714
  • 3
  • 40
  • 83

8 Answers

16

Yes you can. The Apache log gives you information about everyone who visited your website, including bots and spiders. Patterns you can check for:

  • someone made multiple requests in less than a second, or within some other unacceptably short time frame;
  • someone accessed a secure or login page multiple times within a one-minute window;
  • someone accessed non-existent pages, using varying query parameters or paths.

Apache Scalp (http://code.google.com/p/apache-scalp/) is very good at detecting all of the above; a minimal hand-rolled version of these checks is sketched below.
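To make those patterns concrete, here is a minimal Python sketch of the three checks against a combined-format access_log. The file path, the /login URL fragment, and all of the thresholds are assumptions to tune for your own site, not values taken from Scalp:

```python
import re
from collections import defaultdict
from datetime import datetime

# Combined Log Format: host ident authuser [time] "request" status bytes "referer" "agent"
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

hits = defaultdict(list)        # ip -> timestamps of all requests
login_hits = defaultdict(list)  # ip -> timestamps of login-page requests
notfound = defaultdict(set)     # ip -> distinct paths that returned 404

with open("access_log") as fh:  # assumed path
    for line in fh:
        m = LINE_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m["time"].split()[0], "%d/%b/%Y:%H:%M:%S")
        hits[m["ip"]].append(ts)
        if "/login" in m["path"]:  # assumed login URL
            login_hits[m["ip"]].append(ts)
        if m["status"] == "404":
            notfound[m["ip"]].add(m["path"].split("?")[0])

def burst(times, window_s, threshold):
    """True if more than `threshold` events fall inside any `window_s`-second span."""
    times = sorted(times)
    lo = 0
    for hi, t in enumerate(times):
        while (t - times[lo]).total_seconds() > window_s:
            lo += 1
        if hi - lo + 1 > threshold:
            return True
    return False

for ip in hits:
    if burst(hits[ip], 1, 10):        # pattern 1: request bursts
        print(ip, "- more than 10 requests within one second")
    if burst(login_hits[ip], 60, 5):  # pattern 2: hammering the login page
        print(ip, "- more than 5 login-page hits in a one-minute window")
    if len(notfound[ip]) > 20:        # pattern 3: probing non-existent pages
        print(ip, "- requested", len(notfound[ip]), "distinct non-existent paths")
```

The sliding window in `burst` keeps each check linear in the number of requests per IP, which matters on large logs.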

Mohamed
  • 1,404
  • 1
  • 11
  • 14
  • My experience using Scalp is that its output contains a lot of false positives and misses a lot of true positives. It also doesn't distinguish between successful and unsuccessful attempts. Both of these are controlled by the default_filter.xml file that comes from the [PHP-IDS project](https://phpids.org/), so it should get better as that set of filters is improved. That said, using it was much better than trawling through the log files by hand. Parsing the log file was also quite slow, although I understand they are rewriting it in C now, which might help. – Ladadadada Nov 18 '11 at 17:20
  • @Mohamed Any idea where I can find a list of attack patterns for detecting attacks in Apache logs? Those you mentioned are a great start, but are there any others? – user1724140 Mar 24 '13 at 12:53
5

Log analysis won't cover all attacks. For example, you will not see attacks that are passed in the bodies of POST requests. An IDS/IPS can serve as an additional protective measure.

5

mod_security can detect just about anything, including inspecting the bodies of POST requests.

You could even load Snort IDS rules into it and block such requests on the fly, before they ever hit your applications.
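For illustration only, a minimal mod_security 2.x configuration that inspects POST bodies might look like the sketch below; the rule id, pattern, and message are invented for this example, and a real deployment would normally start from a maintained ruleset such as the OWASP Core Rule Set:

```apache
# Enable the rule engine and request-body inspection.
SecRuleEngine On
SecRequestBodyAccess On

# Hypothetical example rule: block an obvious SQL-injection probe in a POST body.
# Phase 2 runs after the request body has been read.
SecRule REQUEST_BODY "@rx (?i)union[\s(]+select" \
    "id:100001,phase:2,deny,status:403,log,msg:'Possible SQLi in request body'"
```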

Eric G
  • 9,691
  • 4
  • 31
  • 58
Troy Rose
  • 141
  • 1
5

As Ams noted, log analysis won't cover all attacks, and you won't see the parameters of POST requests. However, analyzing logs for POST requests is sometimes very rewarding.

Specifically, POSTs are popular for sending malicious code to backdoor scripts. Such backdoors can be created somewhere deep in subdirectories, or backdoor code can be injected into a legitimate file. If your site is not under version control or some other integrity control, it may be hard to locate such backdoor scripts.

Here's the trick:

  1. Scan your access logs for POST requests and compile a list of the requested files. On regular sites, there shouldn't be many of them.
  2. Check those files for integrity and legitimacy. This will be your whitelist.
  3. Now regularly scan your logs for POST requests and compare the requested files against your whitelist (needless to say, you should automate this process; a minimal script is sketched below). Any new file should be investigated: if it is legitimate, add it to the whitelist; if not, investigate the problem.

This way you'll be able to efficiently detect both suspicious POST requests to files that normally don't accept POSTs (injected backdoor code) and newly created backdoor files. If you are lucky, you can use the IP addresses of such requests to identify the initial point of penetration, or you can simply check the logs around that time for suspicious activity.
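A minimal Python sketch of steps 1-3 (the log path, the whitelist file name, and the combined log format are all assumptions) could look like this:

```python
import re

# Matches POST lines in a common/combined-format access_log (assumed format).
POST_RE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "POST (?P<path>\S+) [^"]*"')

# Step 2: your hand-vetted whitelist, one URL path per line (assumed file name).
with open("post_whitelist.txt") as fh:
    whitelist = {line.strip() for line in fh if line.strip()}

# Step 1: compile the list of files requested via POST.
seen = set()
with open("access_log") as fh:  # assumed path
    for line in fh:
        m = POST_RE.match(line)
        if m:
            seen.add(m["path"].split("?")[0])  # drop the query string

# Step 3: anything not on the whitelist deserves a look.
for path in sorted(seen - whitelist):
    print("investigate:", path)
```

Run it from cron and diff the output against the previous run to get the "regularly scan" part of step 3.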

Denis
  • 81
  • 4
4

apache-scalp can check for attacks via HTTP/GET:

"Scalp! is a log analyzer for the Apache web server that aims to look for security problems. The main idea is to look through huge log files and extract the possible attacks that have been sent through HTTP/GET"

Tate Hansen
  • 13,714
  • 3
  • 40
  • 83
  • 3
    Yes, apache-scalp is pretty neat. You can also convert IIS log format to Apache log format and then use this same tool. – atdre Nov 14 '10 at 13:37
4

Check out WebForensik

It's a PHPIDS-based script (released under GPL2) to scan your HTTPD logfiles for attacks against web applications.

Features:

- supports standard log formats (common, combined)
- allows user-defined (mod_log_config syntax) formats
- automatically pipes your web logs through PHPIDS
- categorizes all incidents by type, impact, date, host...
- generates reports in CSV, HTML (sortable table), XML

guy_intro
  • 41
  • 1
  • 1
1

It may be better to scan your database plan cache (and/or log files) than your web server logs, although it would certainly be good to combine these techniques and match up the time and date stamps.

For more information, please see the book by Kevvie Fowler on SQL Server Forensic Analysis.

atdre
  • 18,885
  • 6
  • 58
  • 107
1

Try LORG -> https://github.com/jensvoid/lorg. It has different detection modes (signature-based, statistics-based, learning-based) and some nice features like geomapping, DNSBL lookups, and robot detection (i.e., was the attacker a human or a machine?).

It can make a guess at the success of attacks by looking for outliers in the 'bytes-sent' field, by examining HTTP response codes, or by actively replaying attacks.
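As a rough illustration of the bytes-sent idea (this is not LORG's actual algorithm, and the three-sigma threshold is an arbitrary choice), a Python sketch that flags per-URL response-size outliers might look like:

```python
import re
from collections import defaultdict
from statistics import mean, stdev

# Common/combined log format, capturing the path and the bytes-sent field (assumed).
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "\S+ (?P<path>\S+) [^"]*" \d{3} (?P<bytes>\d+)'
)

sizes = defaultdict(list)  # path -> list of response sizes
with open("access_log") as fh:  # assumed path
    for line in fh:
        m = LINE_RE.match(line)
        if m:
            sizes[m["path"].split("?")[0]].append(int(m["bytes"]))

for path, vals in sizes.items():
    if len(vals) < 10:  # too few samples to call anything an outlier
        continue
    mu, sigma = mean(vals), stdev(vals)
    for v in vals:
        if sigma and abs(v - mu) > 3 * sigma:  # crude three-sigma rule
            print(f"{path}: {v} bytes vs. a mean of {mu:.0f}")
```

An unusually large response to a URL that normally returns a fixed-size page can indicate, for example, a successful SQL injection dumping data.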

The code is still pre-alpha, but under active development.

Adam Smith
  • 11
  • 1