3

From my other question, "Besides Plain-Text passwords, is there anything to worry about with sites that don't use SSL?", there are answers mentioning all sorts of things that can happen without SSL, including malicious attacks.

If there is such a risk of bad things happening, why is HTTPS not forced? I can understand why a site might not have it enabled, but when it is enabled, wouldn't the browser set it up automatically? Searching Google, I found that there are ways to enable it in browsers, though not easily: for Chrome, it was said you should enter "X" into the URL to force HTTPS. Why isn't this a normal setting?

XaolingBao
    Browsers can't force sites that don't support SSL to communicate using SSL. The decision to use SSL is entirely on the site owner. – Lie Ryan Jan 10 '17 at 12:34
  • 1
    Well yes, a site has to enable SSL functionality, but I was asking for sites that DO HAVE SSL ENABLED, why isn't it forced? – XaolingBao Jan 10 '17 at 12:44
  • 4
    Two reasons. First, the HTTPS site is not necessarily the same website as the HTTP one; some servers are configured to serve an entirely different site over HTTPS (e.g. a blog at `http://example.com` and a shopping cart at `https://example.com`). Second, many sites use a different URL for their HTTPS site (e.g. `http://www.example.com` vs `https://secure.example.com`). There are third-party browser plugins like HTTPS Everywhere that compile a database of sites that support HTTPS, along with rules for rewriting requests to the HTTPS version of each site. – Lie Ryan Jan 10 '17 at 12:51
  • Hmm, thanks. The thing is, from what I gather from my other question, it seems this isn't a good idea though? From what the answers mentioned, it would make sense for everything to be HTTPS regardless, and for sites to make their HTTP and HTTPS URLs serve the same content? – XaolingBao Jan 10 '17 at 12:54
  • 1
    While browsers don't do that now, I can tell you that things are slowly marching in that direction on many fronts. – dandavis Jan 10 '17 at 15:14
  • Mostly dupe http://security.stackexchange.com/questions/4369/why-is-https-not-the-default-protocol except in the years since 2011 the picture for SNI (and also OCSP stapling) has gotten better -- and Google last year started boosting https in search results, giving many webmasters a concrete incentive to implement it. – dave_thompson_085 Jan 11 '17 at 03:26

2 Answers

3

Because that's not easily automatable

If there is such a risk of bad things happening, why is HTTPS not forced?

Because that can easily break the site. And then users will just abandon that browser and pick another one that does not break their sites.

I run the "HTTPS Everywhere" plugin in my browsers, and it tries to do something similar. But even it has to use a massive ruleset of about 22,000 files. (From what I understand, the rules are partially generated and partially hand-tuned.)

And judging by these comments in the Amazon ruleset, it's not easy to write working rules that stay working:

  <!--    Amazon has a history of breaking us soon after
        adding rulesets, so these are here to detect that. -->

(Test-URLs omitted)
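Rules like these essentially boil down to per-site regex rewrites over URLs. A minimal sketch of the idea in Python (the patterns are hypothetical examples; this is not the plugin's actual ruleset format or parser):

```python
import re

# Hypothetical rewrite rules in the spirit of HTTPS Everywhere rulesets:
# each maps a regex over the original URL to an HTTPS replacement.
RULES = [
    (re.compile(r"^http://(www\.)?example\.com/"), "https://www.example.com/"),
    (re.compile(r"^http://cdn\.example\.net/"), "https://cdn.example.net/"),
]

def rewrite_to_https(url):
    """Return the HTTPS version of `url` if a rule matches, else leave it alone."""
    for pattern, replacement in RULES:
        if pattern.match(url):
            return pattern.sub(replacement, url, count=1)
    return url
```

The hand-tuning the comment alludes to lives in those patterns: every host variant and path quirk of a site needs its own rule, and the rule silently breaks when the site changes.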

StackzOfZtuff
  • Estimating < 5MB total for the 22k files. I think that is very reasonable in this day and age for what you are getting. – Shiv May 24 '17 at 04:28
1

HTTPS is a protocol, which means both sides have to agree on it. Even just this makes things a bit weird: what would the browser do? Try to connect over HTTPS, fall back to HTTP if that fails, and cache that decision for some amount of time? If something on your site changes, how do you communicate that to the browser to get it to refresh its cache? There's a series of practicalities here that would need to be worked out to make this actually happen in the real world, and they haven't been yet.
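To make that concrete, here's a minimal sketch of the "try HTTPS, fall back, cache the answer" idea (hypothetical Python, not how any real browser works; `probe_https` stands in for an actual connection attempt):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical browser-side cache of hosts that failed the HTTPS probe.
_http_only_hosts = set()

def upgrade_url(url, probe_https):
    """Try to upgrade an http:// URL to https://.

    `probe_https` is a callable taking a hostname and returning True if the
    host answers usably over HTTPS. Note the open question from the text:
    once a negative answer is cached, nothing here ever refreshes it.
    """
    parts = urlsplit(url)
    if parts.scheme != "http" or parts.netloc in _http_only_hosts:
        return url
    if probe_https(parts.netloc):
        return urlunsplit(("https",) + tuple(parts[1:]))
    _http_only_hosts.add(parts.netloc)  # cached forever: the stale-cache problem
    return url
```

Even this toy version has to pick a caching policy, and it has no way to learn that a host later gained HTTPS support; a real browser would also have to define what "answers usably" even means.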

Beyond that, not every site that accepts an HTTPS connection actually works over HTTPS. As a simple example, the New York Times website listens on port 443, but if you change the URL for an article to https, it will just redirect you back to HTTP. If the browser did its own HTTPS redirection automatically, this could very easily put you in an infinite redirect loop; at the very least, you would be waiting through two HTTP requests for every page instead of just one.
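The loop risk can be sketched like this (hypothetical Python; `fetch` stands in for an HTTP client that returns a redirect target, or None for a final response):

```python
def follow_redirects(url, fetch, max_hops=10):
    """Follow redirects, failing loudly on a loop or too many hops.

    This is the failure mode described above: if the browser silently
    rewrites http:// to https:// and the site redirects https back to
    http, the two rewrites chase each other forever.
    """
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            raise RuntimeError("redirect loop at " + url)
        seen.add(url)
        target = fetch(url)
        if target is None:
            return url  # final response, no further redirect
        url = target
    raise RuntimeError("too many redirects")
```

Real browsers cap redirect chains for exactly this reason, but from the user's point of view a capped loop is still a broken page.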

There's usually more work in making a website HTTPS-available than just setting up an HTTPS terminator (although even that can be a bit of work, as you need to load test it to ensure it can handle your full traffic; I once broke a previously nicely-scalable website by skipping this step).

One of the biggest issues is making sure that all the content you embed in a page is also requested over HTTPS, or else you'll get mixed-content warnings and the user will just see a broken page. This can be particularly annoying if you have user-generated content, because now you've got to do some database munging to fix the things users have added, and be very careful not to screw something up while doing it.

And now you've got to use HTTPS on your CDN, and some still don't support that easily, or maybe you have to upgrade your plan to be able to use a custom certificate (since you had a CNAME from your domain to their servers). And now that breaks some scripts you have, because they're written in Python 2.6, which has no SNI support.
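As a concrete illustration of that database-munging step, here's a naive sketch (hypothetical Python; a real migration should first verify that each referenced host actually serves its assets over HTTPS) that upgrades embedded resource URLs in stored HTML:

```python
import re

# Matches src/href attributes that point at plain-HTTP resources, the ones
# that would trigger mixed-content warnings on an HTTPS page.
SRC_ATTR = re.compile(r'(src|href)="http://', re.IGNORECASE)

def upgrade_embedded_urls(html):
    """Rewrite http:// resource URLs in stored HTML to https://."""
    return SRC_ATTR.sub(lambda m: m.group(1) + '="https://', html)
```

Blindly rewriting like this is exactly where you can "screw something up": any embedded host that doesn't serve the asset over HTTPS goes from a warning to a hard failure.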

The list goes on. The point is, there's a non-trivial amount of work for many websites to transition fully to HTTPS, and if browsers tried to force them all at once, things would almost certainly break. And that sucks for users.

Xiong Chiamiov