
I want to block some bad search engine bots like MJ12bot, YandexBot and Ezooms. We have around 200 users in the DirectAdmin environment, and we want to install a "plugin" or "mod" to block those requests, because we don't want to go to every site and add rules to its .htaccess.

Is there a plugin or mod for this for directadmin, and what is it called?

Thanks!

  • You may get better answers at our sister site [Pro Webmasters](http://webmasters.stackexchange.com/). Be sure to search for existing questions before asking, as this topic is very commonly discussed there. – Michael Hampton Dec 18 '12 at 17:33

1 Answer


Create a robots.txt file under /var/www/html with the following contents:

User-agent: MJ12bot
User-agent: YandexBot
User-agent: Ezooms
Disallow: /

Then add the following to your httpd.conf file:

Alias /robots.txt /var/www/html/robots.txt

This robots.txt file will now be served for all virtual hosts on your server, overriding any robots.txt file you might have for individual hosts.
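Since robots.txt only works for bots that choose to honor it, you can also refuse these requests outright at the web server level. A minimal sketch for Apache 2.2 in httpd.conf, assuming mod_setenvif is loaded (the bot names are the ones from the question; extend the list as needed):

```apache
# Tag any request whose User-Agent header matches a bad bot (case-insensitive)
BrowserMatchNoCase "MJ12bot"   bad_bot
BrowserMatchNoCase "YandexBot" bad_bot
BrowserMatchNoCase "Ezooms"    bad_bot

# Deny tagged requests server-wide with 403 Forbidden
<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Location>
```

Because this lives in the main httpd.conf rather than per-site .htaccess files, it applies to all virtual hosts at once. On Apache 2.4 the Order/Allow/Deny directives are replaced by `Require all granted` combined with `Require not env bad_bot` inside a `<RequireAll>` block.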

Ahmed Ossama
  • But if I want a custom robots.txt for a site, will this override it? Also, I have the feeling that these kinds of bots don't obey robots.txt, which is why I want to block them completely. – Michel Bardelmeijer Dec 18 '12 at 13:14
  • @MichelBardelmeijer If the bots don't obey robots.txt, your only option is IP/subnet/AS-level blocking (fine-grained but maintenance-intensive at one end of the spectrum; blocking huge swaths of the internet with little maintenance at the other). – voretaq7 Dec 18 '12 at 18:00
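The subnet-level blocking voretaq7 describes can be sketched with ipset and iptables. This is a hypothetical configuration fragment requiring root; the CIDR ranges shown are placeholders from the documentation range, not real bot networks (look up the actual ranges via whois or the crawler operator's published list):

```shell
# Create a hash set to hold offending networks (requires the ipset package)
ipset create badbots hash:net

# Add offending subnets -- these CIDR ranges are illustrative placeholders
ipset add badbots 192.0.2.0/24
ipset add badbots 198.51.100.0/24

# Drop all inbound traffic from any address in the set
iptables -I INPUT -m set --match-set badbots src -j DROP
```

Keeping the addresses in an ipset rather than individual iptables rules means the rule count stays constant no matter how many subnets you block, and the set can be updated without touching the firewall rules.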