
Perishable Press publishes a frequently updated user agent blacklist that blocks bad bots with htaccess directives.
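For context, blacklists of this kind are usually `.htaccess` fragments along these lines (a minimal illustrative sketch; the user agent patterns here are made-up examples, not the actual 4G list):

```apache
# Illustrative only: return 403 Forbidden when the User-Agent
# matches any pattern in the blacklist (case-insensitive).
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} (libwww-perl|BadBot|EvilScraper) [NC]
  RewriteRule .* - [F,L]
</IfModule>
```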

The article can be found here: http://perishablepress.com/press/2009/03/29/4g-ultimate-user-agent-blacklist/

Would you recommend using such a blacklist? The main goal is to reduce the number of requests to my servers, because I can see that a lot of my traffic is generated by bots visiting my site.

Cheers

Chrisissorry

2 Answers


You're better off using robots.txt to control the well-behaved bots, and request rate-limiting for evil bots. You can't use anything controlled by the client for badly-behaved bots, because they can modify themselves to evade your filters.
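As a sketch of the `robots.txt` approach for the well-behaved bots (the paths and delay value are placeholders; note that `Crawl-delay` is honored by some crawlers but ignored by others, including Googlebot):

```
User-agent: *
Crawl-delay: 10
Disallow: /search/
Disallow: /cgi-bin/
```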

womble

Generally yes, if it meets your requirements/needs and you find no specific conflicts with their definitions and/or approaches. The prudent route is to monitor very closely during the initial implementation to make sure it fits your needs without introducing any unanticipated/unexpected side effects.
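One way to do that monitoring is to check which user agents are actually hitting the site before and after enabling the blacklist. A minimal sketch, assuming an Apache "combined" log format (the log path in the usage example is an assumption; substitute your own):

```shell
#!/bin/sh
# Print the ten busiest user agents in an Apache combined-format access log.
# In that format the User-Agent is the sixth double-quote-delimited field.
top_user_agents() {
    # $1: path to the access log
    awk -F'"' '{print $6}' "$1" | sort | uniq -c | sort -rn | head -n 10
}
```

Run it as, e.g., `top_user_agents /var/log/apache2/access.log`; comparing the output before and after deploying the blacklist shows whether the bots you care about are actually being filtered.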

user48838