  • What would be the recommended approach to block those typical bogus HTTP requests that the server gets bombarded with?
  • Also, has this attack/nuisance been given a name yet, or is it generalized as Bot Activity?

While all of them lead to a 404 Not Found, CPU cycles are wasted in processing these requests, which affects access by legitimate users, at least for a short while.

Y123

2 Answers


While all of them lead to a 404 Not Found, CPU cycles are wasted in processing these requests ...

Somewhere, CPU cycles "must be wasted" to filter out these requests. But how many cycles that will be, and where exactly they will be spent, depends on the kind of requests and on your server and application setup.

If there is a clear source of these requests, you might use simple packet filter rules (iptables, ipfw, or some router in front of the server) to block such requests already at the transport layer, by filtering based on the source IP address. This is the cheapest way, i.e. the fewest cycles are wasted. But in most cases there is no such clear source, so filtering must be done at the application layer, which is more complex and thus needs more CPU cycles. That might be done with a web application firewall (WAF) in front of the server, by filtering rules at your web server, or by filtering these requests inside your web application.
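For illustration, blocking a single abusive source at the packet filter with iptables might look like the following sketch (the address is a placeholder from the TEST-NET-2 documentation range, not from the question):

```shell
# Drop all HTTP/HTTPS traffic from one known-abusive source address
# before it ever reaches the web server. 198.51.100.23 is a placeholder.
iptables -A INPUT -s 198.51.100.23 -p tcp --dport 80 -j DROP
iptables -A INPUT -s 198.51.100.23 -p tcp --dport 443 -j DROP
```

Because the packets are dropped in the kernel's netfilter hooks, no TCP handshake completes and the web server process never sees the request.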

... which affects access by legitimate users, at least for a short while.

While any server on the internet gets lots of such requests, they are usually not so numerous that they really affect the application, i.e. they are more a nuisance and do not amount to a denial of service. If handling such requests is too costly for you, then you might need to rethink the design of your web application: for example, make sure that bad requests are filtered out early and don't cause database lookups or other costly operations.
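The "filter out early" idea can be sketched as a minimal WSGI-style app that rejects obviously bogus paths with a cheap string check before any expensive handler runs (the route prefixes here are hypothetical, not from the answer):

```python
# Hypothetical route prefixes that the real application actually serves.
KNOWN_PREFIXES = ("/api/", "/static/")

def filter_app(environ, start_response):
    """Reject unknown paths before any costly work is done."""
    path = environ.get("PATH_INFO", "")
    # Cheap string comparison first: no database lookup, session
    # handling, or template rendering happens for paths that cannot
    # possibly match a real route.
    if not path.startswith(KNOWN_PREFIXES):
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"Not Found"]
    # Real application logic (database lookups etc.) would run here.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

Typical scanner probes such as `/wp-login.php` are answered with a 404 at almost no cost, while legitimate paths fall through to the application.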

For more details about this problem see How can I defend against malicious GET requests?

Steffen Ullrich

I have found that these automated, undirected scans are usually scanning IPs randomly and do not include a Host: header or include a bogus Host: header. Filtering out requests with bad Host headers can reduce the nuisance logging a bit and possibly reduce the impact on your server, depending on the server's architecture.

This doesn't make your server much more secure, because a more dedicated attacker can simply include the correct header. But it does eliminate a lot of chaff.

(It can have a security benefit by preventing DNS rebinding attacks, though.)
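As a sketch of Host-header filtering (assuming nginx; `example.com` is a placeholder, not from the answer), a catch-all default server can absorb requests whose Host header does not match a configured name:

```nginx
# Catch-all default server: any request whose Host header does not
# match a configured server_name lands here. nginx's non-standard
# return code 444 closes the connection without sending a response.
server {
    listen 80 default_server;
    server_name _;
    return 444;
}

# Only requests with the correct Host header reach the application.
server {
    listen 80;
    server_name example.com;
    # ... real application configuration ...
}
```

Random-IP scans that send no Host header or a bogus one never reach the application block, which also keeps them out of its access log.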

Wim Lewis