5

My login page accepts a "return" parameter, which should contain the URI to redirect the user to if they weren't logged in and tried to access a page that isn't accessible to unauthenticated users. It could be any URI on my site.

How do I properly check that parameter in my server code before I return an HTTP 302 status with that URI to the user, after they have successfully authenticated on the login page?

Is it enough to check that uri[0] == '/' && uri[1] != '/' (so a malicious user can't send a request like /login?return=http://google.com)?

Can a malicious user construct some tricky URI that will send the victim user into trouble? Of course, I assume that other pages do not contain XSS vulnerabilities, GET requests do not change anything in the database, etc.

If a malicious user can trick the victim into clicking his link and the victim is already logged in, bad things will happen anyway, and I can't do anything to prevent that except implement known security measures such as CSRF protection, etc.

SilverlightFox
vbezhenar

8 Answers

5

The best way is to create a whitelist of possible URLs; if the redirect target is not on the whitelist, you can redirect to a default home page instead. However, this isn't very flexible.

You can also run a regex over the whole thing to ensure the URI starts with your website, e.g. http://example.com/, and that what comes after the last / contains only letters, numbers, & or ?, disallowing any other characters (you might as well work with relative paths).
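
A minimal sketch of the whitelist approach in PHP; the paths in the allowlist and the fallback to / are illustrative assumptions, not part of the answer:

<?php
// Hypothetical allowlist of internal paths the login page may redirect to.
$allowed_returns = ['/dashboard', '/account', '/settings'];

$return = $_GET['return'] ?? '/';

// Anything not on the allowlist falls back to the home page.
if (!in_array($return, $allowed_returns, true)) {
    $return = '/';
}

header('Location: ' . $return, true, 302);
exit;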

Lucas Kauffman
4

Depends on how fancy you want to get. I would imagine that this logic would fit the bill:

// Read and trim the "return" parameter from the request
$uri = trim($_GET['return'] ?? '');

// Is this a full URL string (i.e. it has a scheme and host)?
if (filter_var($uri, FILTER_VALIDATE_URL) !== false)
{
    // Redirect user to the default landing page instead
    header('Location: /', true, 302);
}
else
{
    // It is probably an actual URI string so just append it to your site name
    header('Location: https://www.yoursite.com/' . ltrim($uri, '/'), true, 302);
}
exit;
MonkeyZeus
    Your `else` case was basically what I was thinking. Even if an attacker tries to redirect the user to an alternate url with the return parameter, explicitly treating the return as a uri within your domain should prevent that from doing much beyond a 404 error. – Brian S Jan 06 '15 at 22:42
3

The vulnerability of not checking a parameter that you then redirect to is called an open redirect. This is mainly a phishing risk: a user may click a link going to www.google.com (say, for the sake of argument, that you are running Google) because they checked the URL before clicking. However, they are then redirected by the open redirect to www.evil.com, where they don't notice that the URL has changed because the page looks the same as Google, and they enter their Google username and password into the form, sending their credentials to www.evil.com.

Simply validate return to ensure it is a relative path only.

For example, in .NET you could call Uri.IsWellFormedUriString with UriKind set to Relative.

For other languages, similar methods may be available.

If you need to write your own, the check you stated should be fine:

uri[0] == '/' && uri[1] != '/'

i.e. first character is / and second is not /.

RFC 3986 details the format of a relative URL:

 relative-ref  = relative-part [ "?" query ] [ "#" fragment ]

      relative-part = "//" authority path-abempty
                    / path-absolute
                    / path-noscheme
                    / path-empty

So as long as you are checking the first character is / and excluding the network path reference:

A relative reference that begins with two slash characters is termed a network-path reference

which is now used by modern browsers as a protocol relative URL, you are ensuring that the URL is relative and safe against open redirect vulnerabilities.
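
A minimal sketch of that check in PHP, assuming the parameter is named return as in the question; the extra rejection of a backslash in the second position is an added precaution (some browsers normalize \ to /), not something stated in the answer:

<?php
$return = $_GET['return'] ?? '/';

// Relative path only: first character is '/', second is neither '/' nor '\',
// which excludes absolute URLs and network-path ("//host") references.
$is_safe = isset($return[0]) && $return[0] === '/'
    && (!isset($return[1]) || ($return[1] !== '/' && $return[1] !== '\\'));

header('Location: ' . ($is_safe ? $return : '/'), true, 302);
exit;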

SilverlightFox
  • Thanks, I think I'll go that route. I'm aware about `//...` links and that's what I tried to prevent. I thought that there could be more types of dangerous link schemes that I'm not aware of, but it seems that there's not. – vbezhenar Jan 07 '15 at 20:38
1

Treat ALL user input as EVIL. After sanitizing your input, reconstruct the redirect URL so that it always points to your site (domain), and strip all arguments that are not used by your system or that are 'special' in HTML (so strip encoded URLs). Or better yet, only allow returning to a whitelisted part of your site and otherwise return to the "front" page; this way you can only go to a limited set of 'landing' pages, which is much easier to sanitize for.
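
A minimal sketch of the "whitelisted landing pages" idea in PHP; the keys and paths are made-up assumptions:

<?php
// Map short whitelisted keys to internal landing pages instead of accepting raw URLs.
$landing_pages = [
    'inbox'   => '/mail/inbox',
    'profile' => '/account/profile',
];

// Unknown keys fall back to the front page.
$key    = $_GET['return'] ?? '';
$target = $landing_pages[$key] ?? '/';

header('Location: ' . $target, true, 302);
exit;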

LvB
1

You should check the whole parameter, not just the first two characters.

Even though a redirect (assuming you only use that parameter in the Location header) is not as exploitable for XSS as any other page, there have been known vulnerabilities in several browsers that could lead to a successful XSS through 30x redirects, in both Opera and Firefox.

I would recommend using a regex to allow only certain "safe" characters in that parameter, like uppercase letters, lowercase letters, numbers, dots and maybe slashes.
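
A minimal sketch of such a character filter in PHP; the exact character set, the leading-slash requirement and the extra rejection of "//" are assumptions layered on top of the answer:

<?php
$return = $_GET['return'] ?? '/';

// Require a leading slash and allow only letters, digits, dots, dashes and slashes;
// also reject anything containing "//" so it cannot become a protocol-relative URL.
if (!preg_match('#\A/[A-Za-z0-9./-]*\z#', $return) || strpos($return, '//') !== false) {
    $return = '/';
}

header('Location: ' . $return, true, 302);
exit;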

NuTTyX
1

A tricky idea: accept whatever value is given, but before redirecting, send a request to it with curl with the follow-redirects option enabled, and check whether the host of the final result is your own host.
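
A rough sketch of what this might look like in PHP with the cURL extension, assuming the option meant is CURLOPT_FOLLOWLOCATION, that the parameter holds an absolute URL, and that www.example.com stands in for your host:

<?php
$return = $_GET['return'] ?? '/';

$ch = curl_init($return);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirect chain
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response body
curl_setopt($ch, CURLOPT_NOBODY, true);         // a HEAD request is enough
curl_exec($ch);

// The URL we actually ended up at after following redirects.
$final = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
curl_close($ch);

// Only redirect if the final host is our own host.
$host = parse_url($final, PHP_URL_HOST);
header('Location: ' . ($host === 'www.example.com' ? $return : '/'), true, 302);
exit;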

Danile
1

As far as I am aware this is called an open redirect vulnerability:

https://www.owasp.org/index.php/Open_redirect

It can be easily avoided by encrypting the redirect parameter with something like AES and then URL-safe base64-encoding it.
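
A rough sketch of that idea in PHP using the openssl extension; the key handling, cipher choice (AES-256-CBC) and parameter name are illustrative assumptions:

<?php
$key = 'replace-with-a-32-byte-secret-key';  // hypothetical server-side secret

// Encrypt an internal path before placing it in the login URL.
function encrypt_return(string $path, string $key): string {
    $iv = random_bytes(16);
    $ct = openssl_encrypt($path, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
    return rtrim(strtr(base64_encode($iv . $ct), '+/', '-_'), '=');  // URL-safe base64
}

// Decrypt it on the login page; tampered or garbage input simply fails to decrypt.
function decrypt_return(string $token, string $key): ?string {
    $raw = base64_decode(strtr($token, '-_', '+/'));
    if ($raw === false || strlen($raw) <= 16) {
        return null;
    }
    $plain = openssl_decrypt(substr($raw, 16), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, substr($raw, 0, 16));
    return $plain === false ? null : $plain;
}

$return = decrypt_return($_GET['return'] ?? '', $key) ?? '/';
header('Location: ' . $return, true, 302);
exit;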

Petah
0

The risk isn't high. The adversary has to be able to send the user to a URL with a return parameter of their choosing. If the adversary were able to do that, they could most likely have sent the user to their own domain in the first place instead of your login page.

It might be possible to abuse it in certain phishing attacks. But most phishing attacks would simply copy the design of your login page on a URL on their own domain.

That said, defense in depth is usually a good idea. So it is still worth considering how the return parameter can be validated.

Others have pointed out that you can create a whitelist of permitted URLs. A different approach would be to apply a signature or message authentication code to the URL.

If http://example.com/private requires authentication, it could redirect to http://example.com/login?return=http://example.com/private&mac=Tej3M9aawsaCihGGXBGABPyyYsQQPX1o3PQeGg. The login page can check that the MAC is valid for the specified return URL, and refuse to redirect to the return URL if it doesn't have a valid MAC. It may be a good idea to have the MAC depend not only on the return URL but also on a cookie.
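
A minimal sketch of the MAC idea in PHP; the secret, helper names and fallback are illustrative assumptions, and the cookie binding mentioned above is left out for brevity:

<?php
$secret = 'server-side-secret-kept-out-of-urls';  // hypothetical shared secret

// Called by the protected page when building the login redirect.
function sign_return(string $url, string $secret): string {
    return hash_hmac('sha256', $url, $secret);
}

// Called by the login page before redirecting back.
function verify_return(string $url, string $mac, string $secret): bool {
    return hash_equals(sign_return($url, $secret), $mac);
}

$return = $_GET['return'] ?? '';
$mac    = $_GET['mac'] ?? '';

// Refuse to redirect unless the MAC matches the supplied return URL.
header('Location: ' . (verify_return($return, $mac, $secret) ? $return : '/'), true, 302);
exit;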

kasperd
  • I wouldn't say the risk isn't high. This clearly is a `Open Redirect` case, and can hurt SEO rankings if someone used it to "hide" some bad URLs. I hope you wouldn't mind non-personal downvote. – AKS Jan 07 '15 at 02:06
  • @AyeshK You are forgetting that the login page would only redirect upon receiving a POST request with a valid user name and password. That means it is not at all open. I don't think any legitimate crawler would POST to a login form, thus it could not even know if the return parameter means anything since the server hasn't looked on it yet. – kasperd Jan 07 '15 at 07:33