
We have blocked some bad bots that relentlessly try to access our site using "Deny" directives in NGINX. It's not possible for us to block them at the firewall, as we sit behind a load balancer from our cloud provider that we have no access to.
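For context, the blocking itself is just plain deny rules, roughly like this (the address below is only a placeholder):

server {
    # Requests from this address get a 403, and nginx writes an
    # "access forbidden by rule" line to error.log for each one.
    deny 203.0.113.50;
    ...
}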

This directive works fine, but these requests flood our nginx error.log. We rotate the logs, but there are so many of these entries that we can't actually use the error log to spot any real errors we should be mindful of.

So a few questions:

  1. How do we prevent nginx from sending these errors to the error.log?
  2. Why does nginx consider this an error? If you specify a deny directive and the IP is denied, then from an HTTP perspective this is a successful 403 response and should not be considered an error at all (IMO).
  • Further, to anyone else who's interested, my current temporary solution is to run the following (which simply filters out all these messages): `tail -f /var/log/nginx/error.log | grep -v "access forbidden by rule"` – Scott Johnston Jun 06 '16 at 21:23

3 Answers


Add

access_log  /dev/null;
error_log /dev/null;

at the same level as the deny directive (i.e. in the same server or location block). This should suppress those entries.
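For example, a rough sketch (the address is a placeholder, and the log directives could also sit at location level):

server {
    # Placeholder for the bot addresses being denied.
    deny 203.0.113.0/24;

    # Discard both logs for this context so the 403s produced by
    # "deny" no longer flood the real log files.
    access_log /dev/null;
    error_log  /dev/null;
}

Note that this also discards legitimate log entries for the same context, so it is best scoped as narrowly as possible.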

– unNamed

In such a case, a better solution is to use a geo block together with if to reject the requests, like:

geo $blocked {
    default 0;
    1.1.1.1/32 1;
}
...
server {

  if ($blocked) {
    return 444;
  }
}
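For completeness, a rough sketch of how the pieces fit together (the http context and listen port are assumed):

http {
    # Map the client address to a flag: 1 = blocked, 0 = allowed.
    geo $blocked {
        default    0;
        1.1.1.1/32 1;
    }

    server {
        listen 80;

        # 444 is nginx-specific: the connection is closed without a
        # response, and no "access forbidden by rule" error is logged.
        if ($blocked) {
            return 444;
        }
    }
}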
– pva

  1. Depending on your configuration, you can add an error_log directive in the relevant context.
    The example below uses access_log, but the same idea applies to the error log; note that error_log has no real "off" value, so point it at /dev/null instead (see the sketch after this list).
location = /robots.txt {
    log_not_found off;
    access_log off;
}

If you can't, filter the log with grep (or something similar).

  2. No idea. Maybe it's a question of layers, since IP addresses are blocked at the network layer.
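For point 1, a sketch of the error_log variant, assuming the deny rules live in their own location block (the address, path and level are illustrative):

location / {
    deny 203.0.113.50;

    # "access forbidden by rule" is emitted at the error level, so raising
    # this context's threshold keeps only serious problems in the log ...
    error_log /var/log/nginx/error.log crit;

    # ... or discard this context's error log entirely:
    # error_log /dev/null;
}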
– BeWog