
I want to create a .htaccess file that does the following things:

  1. Routes all requests through /index.php
  2. Forces use of SSL
  3. Doesn't allow robots to load individual PHP files directly. For instance, if a robot requests /pages/folder/file.php, it should be refused. I'm constantly getting error reports because robots load those pages directly and the database variable isn't set or something. However, my website's JavaScript still needs to be able to access them.

So far I think I've got the first two down:

        # Enable rewriting (must come before any RewriteCond/RewriteRule)
        RewriteEngine On
        RewriteBase /

        # Redirect all queries to SSL port 443
        RewriteCond %{SERVER_PORT} !^443$
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

        # Don't allow www, redirect to https://
        RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
        RewriteRule ^(.*)$ https://%1/$1 [R=301,L]

        # Route everything via index.php
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]

Is this correct? Is it in the right order? How can I achieve point three?
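
One idea I've come across for point three (untested, and not bulletproof, since any client can send the header) is to require a custom request header that my JavaScript already sets — XMLHttpRequest sends X-Requested-With automatically — and return 403 for direct .php requests that lack it:

        # Deny .php requests that don't carry the AJAX header,
        # except index.php which handles normal page loads
        RewriteCond %{HTTP:X-Requested-With} !^XMLHttpRequest$ [NC]
        RewriteCond %{REQUEST_URI} !^/index\.php$
        RewriteRule \.php$ - [F]

If I used fetch() instead of XMLHttpRequest I'd have to set that header manually, since fetch() doesn't add it on its own.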

Chud37
    It is next to impossible to identify a robot correctly. It is best to set up a robots.txt and live with it. – Gerald Schneider Jun 22 '17 at 07:09
  • Also, the issues you have are a result of problems in your code — a robot follows the links it sees, and if there are links on your site that produce errors when followed, then that's a bug in your code. – Sven Jun 22 '17 at 07:14
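
Following the robots.txt suggestion above, a minimal sketch that asks well-behaved crawlers to skip those PHP files might look like this (the /pages/ path is just the example from the question; it would need to match the actual include directory):

        User-agent: *
        Disallow: /pages/

Note this only helps with compliant crawlers; misbehaving bots ignore robots.txt entirely.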

0 Answers