
I have some rules in nginx to block traffic: bad bots/user agents, IPs, wp-login requests. All of these rules block with return 444.
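For illustration, rules of this kind typically look something like the following sketch (the paths, bot names and address are only examples, not the actual rules in question):

    # illustrative only -- example paths, bot names and address
    server {
        listen 80;
        server_name example.com;

        # block WordPress login probes
        location = /wp-login.php {
            return 444;
        }

        # block a few known-bad user agents
        if ($http_user_agent ~* "(MJ12bot|AhrefsBot|SemrushBot)") {
            return 444;
        }

        # block a specific client address
        if ($remote_addr = "203.0.113.7") {
            return 444;
        }
    }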

fail2ban is reading the nginx logs, but it can't differentiate between the rules because they all return 444.

I need some trick to differentiate the nginx rules inside the log so that I can apply different blocks with fail2ban. Is it possible?

David

1 Answer


Rather than making your web server return 444, configure these nginx locations not to log anything. That way, the CPU and disk I/O saved by not writing those log entries can be used for your real visitors.
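A minimal sketch of that idea, assuming the blocking happens in a location block (the path here is only an example):

    # keep whatever rejection the location already does,
    # but stop writing these requests to the access log
    location = /wp-login.php {
        access_log off;   # nothing is written, so there is nothing for fail2ban to scan
        return 444;       # or any other way of refusing the request
    }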

CPU/IO time is also saved when you don't need fail2ban to scan through the logs.

Every real visitor is also spared from being subject to iptables/nftables rules slowing down their access.

You'll also be saved the anguish of looking at the logs and focusing on the background noise of the internet rather than the real visitors you care about.

danblack
  • PS, don't take me for a fail2ban hater. I was a core maintainer for a number of years, until the number of requests like this became too much and I quit. – danblack Sep 15 '18 at 11:13
  • Interesting, but: not all the requests that a bad crawler launches will be identified by my rules, yet I only need to identify one to block it with fail2ban for many hours or for good. Crawlers come back after a few minutes and repeat their scans over and over. Processing the unidentified scans has a CPU cost for the server. A 444 in the logs gives me a simple way to block at a low level with an iptables rule and save that CPU cost. But when I view the logs, it is difficult to know what actually happened on a blocked line. – David Oct 04 '18 at 20:26
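For reference, a fail2ban setup along the lines the last comment describes could look roughly like this. The filter name, log path, ban time and file locations are assumptions, and the failregex assumes the default combined access log format, where the status code follows the quoted request line:

    # /etc/fail2ban/filter.d/nginx-444.conf  (hypothetical filter)
    [Definition]
    # match any access-log line whose response status is 444
    failregex = ^<HOST> -.*" 444 \d+
    ignoreregex =

    # /etc/fail2ban/jail.d/nginx-444.conf  (hypothetical jail)
    [nginx-444]
    enabled  = true
    port     = http,https
    filter   = nginx-444
    logpath  = /var/log/nginx/access.log
    maxretry = 1
    bantime  = 86400

Because this matches every 444 line the same way, it still cannot tell the blocking rules apart; distinguishing them would need the rules to leave different traces in the log (for example, logging the blocked locations to separate files and pointing separate jails at them).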