
I'm using some popular commercial forum software, and to help further secure the admin area I have changed the name of the admin folder. However, with regard to robots.txt, if I go ahead and put:

User-agent: *
Disallow: /random_admin_name/

Then wouldn't I just be exposing the "secret" admin name now!?

So my question is - how do I stop the admin from being indexed and found by someone else?

Brett

4 Answers


Renaming the folder and then shoving that name into robots.txt just shows an attacker which directory you want to protect. Think of it like this: "I will put my jewels in the third drawer, but leave a sign for the robber that says: 'don't look in the third drawer!'"

Your real option for protecting yourself here is to place an .htaccess file in the directory you DON'T want accessed, and in that .htaccess file ONLY give yourself access to that directory:

Order Deny,Allow
Deny from all
# Replace 127.0.0.1 with your own IP address
Allow from 127.0.0.1

Or you can add mod_security or mod_rewrite and write a similar rule.
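For example, here is a rough mod_rewrite sketch (assuming mod_rewrite is enabled; the directory name and the IP address are placeholders you would replace with your own):

# Sketch only: return 403 for the renamed admin directory unless the
# request comes from your own IP (203.0.113.5 is a placeholder)
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/random_admin_name/ [NC]
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.5$
RewriteRule ^ - [F]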

munkeyoto

how do I stop the admin from being indexed and found by someone else?

From Google's Using meta tags to block access to your site documentation, you can add the following to the <head> section of the page:

<meta name="robots" content="noindex">

However, this is regarded as security through obscurity. That is not bad in itself, but be aware that it does not add any real security to your admin site. It will stop the page being indexed by compliant search spiders, and, because you no longer need a robots.txt entry, an attacker cannot read robots.txt to find it. You should still make sure your admin system is otherwise as secure as possible (ideally externally tested) and locked down (for example, restricted by IP address or accessible only via VPN).
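If you would rather not edit every template, or some admin URLs are not HTML pages, a similar effect is possible with the X-Robots-Tag response header instead; a minimal .htaccess sketch, assuming mod_headers is available and the file lives inside the renamed admin directory:

# Sketch only: send a noindex header for everything served from this directory
# (requires mod_headers)
<IfModule mod_headers.c>
Header set X-Robots-Tag "noindex, nofollow"
</IfModule>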

SilverlightFox

Don't worry about it being found by someone else. Having a secret URL is just security through obscurity. Feel free to disallow robots, since you don't want it crawled, but use accounts/roles/etc to actually secure the page.

You can't design a secure system and hope that no one stumbles across a secret web page that undermines everything. Assume they will find it, and design for that. That means using a strong password, detecting/blocking malicious users, serving the page over HTTPS, etc.
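For instance, one common extra layer on Apache is HTTP Basic authentication in front of the admin directory; a rough .htaccess sketch, assuming mod_auth_basic and a password file you have created with the htpasswd tool (the file path is a placeholder):

# Sketch only: prompt for a username/password before serving anything here.
# /path/to/.htpasswd is a placeholder; keep the file outside the web root.
AuthType Basic
AuthName "Admin area"
AuthUserFile /path/to/.htpasswd
Require valid-user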

Gray
  • Yeah.... I usually use IP white-listing for areas such as this; but sometimes it can be a pain when someone who needs access has dynamic IPs and also wants to access from several places ha! Thanks! – Brett Apr 24 '14 at 15:30

Robots.txt will not stop bad robots, only well-behaved search engines. Create a .htaccess file and enter:

 # Allow only LAN addresses and localhost; everyone else is denied
 Order Deny,Allow
 Deny from All
 # LAN IPs
 Allow from 192.168
 Allow from 10
 # Localhost
 Allow from 127
 # Uncomment the line below to show a custom 403 Forbidden message
 # to visitors who are not allowed
 #ErrorDocument 403 "You are not allowed here"

Save the file.
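If the server runs Apache 2.4, note that Order/Allow/Deny are deprecated there; a rough equivalent using the newer Require syntax (assuming mod_authz_core and mod_authz_host) would be:

 # Apache 2.4 syntax; multiple Require lines are OR'd together by default
 Require ip 192.168
 Require ip 10
 Require local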

anonman