
I have a private (= I'm the only user) site at example.com/private/. Nothing else is published at this host (it's my domain).

I don't want anyone to know that there is anything at example.com, especially not my private site.

Now let's say Alice "guesses" the URL and visits example.com/private/. Of course I have protected the site by requiring a login, but still, I don't want Alice to know that there is such a site at all, and I don't want her to try out the login etc.

I wonder if the following method could help me here:

With the Firefox add-on RefControl I can set a custom Referer header to use only for requests to a certain host.

I set the Referer to (e.g.) 9b2389Bqa0-ub712/bauUU-UZsi12jkna10712 for any request to example.com.

Now I check the visitor's Referer with .htaccess (I don't know exactly how it is possible, but I heard it should be; see the question on Code Review SE):

  • if it is 9b2389Bqa0-ub712/bauUU-UZsi12jkna10712, do nothing special (= access to the site is possible)
  • if it is something else, send HTTP error 404
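Something like the following is what I have in mind (not tested; assuming mod_rewrite is available and Apache 2.4, where RewriteRule accepts the R=404 flag):

```apache
# Answer 404 unless the Referer is exactly the secret token.
# Requires mod_rewrite; R=404 needs Apache 2.4.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^9b2389Bqa0-ub712/bauUU-UZsi12jkna10712$
RewriteRule ^ - [R=404,L]
```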

I guess a fake site could be shown instead of the 404, but in my case I want no one to see a difference between /private (which exists) and /foobar (which doesn't exist → 404).

Would this work? Is it possible with .htaccess? Has this method any flaws? Anything I should change? Any similar methods?

Updates & clarifications

  • The whole host uses HTTPS.
  • I don't need to hide the fact that there is a server, I just want to hide that there is content served.
  • Thanks @Adnan for bringing the term "plausible deniability" into play → I want this (client-side)!

  • Some proposed using a hard-to-guess URL (e.g. the secret string appended to the URL): While this would certainly work, I think the Referer method has the advantage that you can't (easily) reveal the secret by accident. URLs are more visible than HTTP headers: someone looking at your screen, a bookmark list published by accident (not remembering that the secret URL is included), browser history, a screenshot of your desktop with the browser address bar in the background, …. And then you have the initial problem again: how to hide that there is a site when Alice "guesses" (or otherwise obtains) the URL.

  • Some proposed using an external (even better, local) login page. I like that idea. Compared to the Referer method (if it can be implemented with .htaccess, which I assume is possible), it has the disadvantage that you might need to adapt the site's code/CMS.

  • For discussion about the .htaccess, see the question on Code Review SE

  • As @НЛО points out, a custom header would be better. Yes, I think so, too. But I haven't found a Firefox add-on for that yet (see my question on Super User).

unor
    This method has been used on a number of 'underground' sites for years; legit looking site, but with a specific referer on a certain page, you get access to the real site. – Adam Caudill Apr 08 '13 at 12:48
  • Also consider using a custom HTTP header instead of Referer. Referer is meant to be logged by default, hence providing Alice (should she obtain the logs) full info about the obscurity scheme being used. – Evgeniy Chekan Apr 19 '13 at 10:37
  • @НЛО: Yeah, I thought about this, too, but didn’t find a Firefox add-on that would accomplish this. I just opened a [question at Super User](http://superuser.com/q/584918/151741). – unor Apr 19 '13 at 11:36
  • @unor [ModifyHeaders](https://addons.mozilla.org/en-us/firefox/addon/modify-headers/) allow any HTTP-headers magic :) – Evgeniy Chekan Apr 19 '13 at 11:57
  • @НЛО: In [ModifyHeaders](https://addons.mozilla.org/en-us/firefox/addon/modify-headers/), all custom headers are sent globally. I didn’t find a way to restrict it so that it only sends the custom header for requests to my host. – unor Apr 19 '13 at 14:20

9 Answers



I personally think you're doing alright. As long as your underlying login method is secure, add as many obscurity layers as you want.

I have worked with some clients that wanted the exact thing you're trying to achieve. I've always used one of these two methods:

  • Cross-site login form: A local .html file with a login form that submits to example.com/private/login.php. The login form had a hidden field with a randomly generated token.

Of course anyone could've distributed the same .html file, but that wasn't important, because our main security method was the login process itself; the username and the password.

Any request to example.com/private was rejected and a 404 was shown to the visitor. But when a POST request with the correct credentials was received, a session was started and the website worked normally.

The upside of this method is that it grants you plausible deniability. If someone gets hold of the .html file they can try to login and they'd still get 404. There's no proof that this file is related to an actual working website.
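A minimal sketch of such a local page (the field names and the token value are illustrative, not from the actual setup):

```html
<!-- login.html, kept locally and never published anywhere. -->
<!-- Posts the credentials plus a random token to the hidden endpoint. -->
<form method="POST" action="https://example.com/private/login.php">
  <input type="hidden" name="token" value="aRandomlyGeneratedToken">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="submit" value="Log in">
</form>
```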

  • Extra password in the URL: Pretty much the same as your method, but the extra authentication token was in the URL. Any request to example.com/private was rejected and a 404 was shown to the visitor. But when a proper example.com/private/login.php?token=Zz37vQQCnLTpe527xeFfFEG9 was received, the login form was shown.

Most clients liked this method as they were able to bookmark that URL. However, the downside of this method is that it does NOT grant you plausible deniability. Anyone with the URL can prove that there's a working login form on that website.

To be honest, I like your method and I think I might use it someday.

IMPORTANT: All of the above applies only after you make sure that your login method is secure.

Adi

Your method is functionally equivalent to requiring authentication with two passwords, the Referer being one of them. A more common variant is to use a secret URL, i.e. to make the "special string" part of the path to the private site. Putting the secret string in the URL brings some extra details to think about (e.g. users can bookmark it, meaning the string is written to a not-so-hidden file on the user's machine), but also some goodies (e.g. users can bookmark it, making it compatible with all kinds of browsers, not just Firefox-with-an-extension). For usability, I tend to think that a secret string in the URL itself is a better trade-off, but that's your call.


I hope you are using HTTPS for your complete site. Indeed, if you use plain HTTP for both the public and the private site, then a passive eavesdropper may observe your communications with the private site, revealing your passwords and all your secrets. This is bad.

If you use HTTPS for just the private site, then you have a server which listens on port 443, and attackers can see it quite clearly (it is as simple as making a connection attempt on it). If the main, public site is HTTP-only, attackers will soon understand that there is a private site (one does not go to the trouble of buying and setting up an SSL server certificate just for the fun of it). If you want your private site to remain undetected, then you must use HTTPS for the public site as well (some people recommend that generically, for a variety of reasons, not all of them technical).


I don't know about the exact behaviour of the RefControl extension, but you'll want to make sure that it sends your special Referer string only to the HTTPS version of your site, not the HTTP (if it exists). There again, I would personally feel safer with a secret string in the URL, because at least I know when it will be sent and when it will be not.

Thomas Pornin
  • _"There again, I would personally feel safer with a secret string in the URL, because at least I know when it will be sent and when it will be not."_ **Gold** – Adi Apr 08 '13 at 15:52

If you do not want someone to know that there is a webserver at example.com unless some special URL is requested, then you have to implement that as a rule right at the IP firewall level. It has to look for SYN packets arriving for port 80, and look at the payload to see that it's a valid HTTP request for the permitted URL. If not, then just drop the packet.

Otherwise, if the request for the wrong URL gets through to the server, the server announces its presence by establishing the TCP connection (and responding with a 404).

If you're on Linux, you can use the iptables string module to do this kind of thing.

Take a look here: https://stackoverflow.com/questions/4628157/allow-connections-to-only-a-specific-url-via-https-with-iptables-m-recent-pot

Kaz
  • Thanks for the info :) However, for my case it's sufficient that no content/site is visible. – unor Apr 09 '13 at 06:29
  • If you get a 404, then the site is visible. If you want the site to be visible, but not serve any content except to authorized users, then why don't you simply password protect it. – Kaz Apr 09 '13 at 07:15
  • I thought of the default server 404 page, the same that you'd get when visiting a non-existent resource like `/foobar`. – unor Apr 09 '13 at 07:48
  • A possibility is some kind of web knocking. The user visits another URL on the server, or a pattern of certain URLs. The URLs all yield 404s, but the server makes a note of the pattern and then reveals the top-level site to that client. – Kaz Apr 09 '13 at 08:26
  • @unor Your assumption is correct. By default, the web server will give `404` when a resource doesn't exist. If I were poking around to check if you have something on your site and I get `404`s everywhere, I'll either think you're hiding something or I'll just conclude that you have nothing. But in both cases I'll have no proof that you _actually_ have something. – Adi Apr 09 '13 at 11:39
  • "It has to look for SYN packets arriving for port 80, and look at the payload to see that it's a valid HTTP request for the permitted URL" That's not possible. The SYN request *does not have a payload*. You have to actually accept the connection (respond with SYN|ACK) before the payload will come. – derobert Apr 10 '13 at 02:12
  • @derobert Oops, you're right. It's been a couple of years since I looked at TCP! I don't think this is doable: that is, to maintain complete "radio silence" while deciding based on content whether to take the request. Bummer. – Kaz Apr 10 '13 at 03:12

Yep, it's fine. As long as the password layer is secure, adding more layers of obscurity will help.

A few other tricks (mix and match):

  • Make the login page only accessible via an AJAX POST request (to log in you craft a POST request in the JS console)
  • Make it only work if you've accessed example.com/trigger?pwd=abcd (which is a 404 page as well) within the last minute.
  • Make the URL a function of the date and time. Something easy to calculate in a JS console but hard to guess
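The last trick could be sketched like this (purely illustrative; the hash function and the shared secret are my own assumptions). You would run it in the browser's JS console to compute today's URL:

```javascript
// Derive today's secret path from a shared secret plus the date.
// Easy to compute in a JS console, hard to guess without the secret.
function secretPath(secret) {
  const day = new Date().toISOString().slice(0, 10); // e.g. "2013-04-08"
  let hash = 0;
  for (const ch of secret + day) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return "/" + hash.toString(36);
}

console.log("https://example.com" + secretPath("my-shared-secret"));
```

The server would need the same derivation to decide which path to answer on each day.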
Manishearth

Assuming you designed the site, the best thing is to add a clause at the start of the page that "/private/" refers to. The clause would be something like: if user.authenticated = true, then load your page, else send error 404.

Bear in mind my labelling is very general.

The user.authenticated check would vary with your page's language, but it would simply test whether Alice had already logged in; if she hadn't, she would get error 404, meaning she would have to go manually to example.com/home.php or login.aspx or whatever the login page is called.

chrisc
  • This, however, would still present the login screen to Alice, right? – unor Apr 08 '13 at 12:02
  • If you adopt Adnan's solution where the login screen is on a different public site, that's not a problem. – armb Apr 08 '13 at 12:36
  • Agreed, just not so sure cross-site scripting is indeed the answer, as then you need a damn strong encryption layer to avoid MITM attacks. – chrisc Apr 08 '13 at 14:44

One solution: Turn off directory indexing and put your content under 'example.com/9b2389Bqa0-ub712-bauUU-UZsi12jkna10712'. Bookmark this special URL. An attacker who tries a brute-force enumeration attack will get a genuine HTTP 404 error most of the time. With your solution, attackers may perform timing analysis and notice that 404s from '/private' are slower than those for '/nosuchpage'. In the solution I propose there is only one kind of 404 response.

Disadvantage: Ugly URLs. There are other disadvantages I can think of, but those are also present in your scheme.
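Turning off directory indexing is a one-liner in .htaccess (assuming Apache and that Options overrides are allowed):

```apache
# Prevent Apache from listing directory contents, so the secret
# path cannot be discovered through an index page.
Options -Indexes
```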

  • It doesn't have to be anywhere near that ugly. The name of your childhood pet will suffice. Heck, it can be one character long. Make your server respond only to `example.com/x/`. If there is an attacker, that attacker must already know about `example.com` and the goal was to prevent that from happening in the first place! – Kaz Apr 09 '13 at 01:44
  • Would a timing analysis even work when I check *every* URL? It doesn't load the site until the Referer is the special one, so the load should be the same for all URLs (for users without this Referer), or am I wrong? – unor Apr 09 '13 at 06:35

If the attacker got "HTTP 401 Unauthorized" responses no matter what they tried to access, would that count as the attacker finding out that there is "something" at that site? The attacker already knows that that domain is registered and that there is a web server running on that host name.

If this solution suits you, use http://httpd.apache.org/docs/2.4/howto/auth.html
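The linked howto boils down to something like this in .htaccess (a sketch; the paths and realm name are placeholders):

```apache
# Ask for credentials on every request; anything else gets 401.
AuthType Basic
AuthName "Restricted"
# Password file created with: htpasswd -c /path/to/.htpasswd alice
AuthUserFile /path/to/.htpasswd
Require valid-user
```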

Disadvantage: you cannot claim "the site is empty, check for yourself". If you need that, you're in an interesting business. :-)


You can use this simple PHP code (create an index.php file):

<?php
// Serve the page only when the correct passkey is given; for any
// other request, answer with a plain 404 like a non-existent resource.
if (isset($_GET['passkey']) && $_GET['passkey'] === '9b2389Bqa0-ub712/bauUU-UZsi12jkna10712') {
  ?>
  ... your html page ...
  <?php
} else {
  header($_SERVER['SERVER_PROTOCOL'] . ' 404 Not Found');
  exit;
}
?>

If you want to access your site, you simply open the URL 'www.example.com/?passkey=9b2389Bqa0-ub712/bauUU-UZsi12jkna10712'.

Nicola

I would rather allow only Basic-Auth requests to be processed by the web server at that URL in this case. So if Alice tried to guess, she would get a 404 at example.com/private. But since you know the URL, you could directly send the auth header, e.g.:

Authorization: Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==

along with your request to example.com/private, and it would be accepted and processed.

gPlusHub.com