
I realise there are similar questions around this topic, however, I think this is sufficiently different/focused not to be a duplicate. I hope it doesn't sound like a soapbox piece.

HTTP/2 effectively mandates TLS, since there isn't a mainstream browser implementation supporting h2c.

Whilst the aims of the "SSL everywhere" movement seem reasonable, I'm unconvinced. I'm concerned that in practice it will make the web less secure whilst creating the illusion of security.

In many parts of the western world, bandwidth exists in sufficient quantity for local caching to be overlooked as a concern, but that certainly isn't universal - indeed, reliable connectivity is an issue in some locations, and a caching proxy is an appealing solution.

TLS proxies already exist that can be used to mitigate this, provided someone is willing to install a root CA to accept re-signed content. Some corporate desktops do this as part of a standard build. Forcing TLS on web users will encourage this practice.

There will also be cases where in lieu of installing a root CA, users will become accustomed to accepting self-signed or suspicious certificates, potentially to a level where it becomes automatic even for sites that really shouldn't have this issue.

There's a psychological impact to churning out the message that "this site is secure" - it predisposes users to think that if they can see a secure padlock/green tick/whatever then they don't need to concern themselves with what information they're sharing, and why. Legitimate sites can be hacked, less reputable ones can get SSL certificates, and if there isn't at least one intelligence agency or organised crime cartel with a copy of a real root CA cert, then I'm a teapot.

I'm hoping you can convince me I'm wrong; are there strong technical reasons for browsers mandating TLS for HTTP/2?

Afterthought Whilst I fully appreciate the concerns raised by the NSA and GCHQ getting caught spying on popular web-based email and social media platforms, it bemuses me that platforms run by multinational corporations are seen as more trustworthy - some, if not all, of them are gathering intelligence, rebranded and justified as targeted advertising.

bfloriang
Phil Lello
  • deprecation of much worse plaintext. Besides that, TLS came close to being mandatory for HTTP/2, and for some (probably non-browser-related) reason that requirement was dropped. – SEJPM Mar 13 '16 at 21:12
  • Plaintext is absolutely fine when it's public content and you're happy for it to be cached. – Phil Lello Mar 15 '16 at 22:40
  • For example, a website advising on sanitation if a natural disaster means access to safe drinking water is scarce. Privacy isn't a concern in that situation. – Phil Lello Mar 15 '16 at 22:54
  • But integrity (which TLS also guarantees) very much is a concern, for every website. People get attacked over insecure channels. Malicious JavaScript gets injected, ads get inserted, sometimes they get their browsers weaponized. These are all known attacks, and are trivial for any network (be it an ISP or a hacked router) to pull off. – Eric Mill Mar 16 '16 at 13:42
  • I recommend https://https.cio.gov/everything/ for some rationale on why the US government is moving to HTTPS for all websites (even the less sensitive ones). I also wrote this blog post about why it's going to be okay to leave HTTP behind: https://konklone.com/post/were-deprecating-http-and-its-going-to-be-okay – Eric Mill Mar 16 '16 at 13:44
  • @konklone integrity only requires signatures, and signed plaintext could be cached without sharing keys, and stop inappropriate header tampering (which should be regulated by legislation). Mandatory TLS opens up a new DoS vector via certificate revocation. Intelligence agencies aren't doing their job if sniffing TLS really presents a challenge. – Phil Lello Mar 16 '16 at 14:41
  • Signatures aren't handled at the protocol level, or automatically by browsers. You and your browser don't do signature checking on arbitrary websites, so anyone can inject JavaScript into arbitrary websites. Certificate revocation is not a meaningful DoS vector. All available evidence shows that intelligence agencies are very much affected by properly configured TLS. (TLS is, of course, often misconfigured.) – Eric Mill Mar 16 '16 at 17:09
  • I don't think this can get resolved via comments, so bowing out. There is a lot of content where intelligence agencies aren't a concern, and signature checking can be added to servers/browsers, just like http/2 and ssl/tls were. – Phil Lello Mar 16 '16 at 17:33

1 Answer


There are some definite technical reasons for having HTTP/2 only use TLS. As a starting point:

  • Negotiation speed - without TLS, the upgrade negotiation is done as an additional HTTP request, so there are extra round trips. With TLS, the protocol is negotiated as part of the TLS handshake (via the ALPN extension), and from TLS 1.3 onwards this can be a single round trip.

  • Proxies - when using plain text, many internet proxies can get in the way, and badly configured proxies are likely to cause problems if they don't understand HTTP/2. With TLS this is avoided: nothing between the server and client can interfere.
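
To make the negotiation difference concrete, here's a minimal Python sketch. The hostname and the `HTTP2-Settings` value are illustrative only, and this is a simplification of the real handshakes: the cleartext path needs a full HTTP/1.1 Upgrade exchange before any HTTP/2 frames can flow, while the TLS path piggybacks protocol selection on the handshake via ALPN.

```python
import ssl

# Path 1 (h2c, cleartext): the client sends an HTTP/1.1 request carrying an
# Upgrade header, then must wait for a "101 Switching Protocols" response
# before speaking HTTP/2 -- an extra round trip before real work starts.
# (The HTTP2-Settings value below is a placeholder, not a meaningful payload.)
h2c_upgrade_request = (
    b"GET / HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"Connection: Upgrade, HTTP2-Settings\r\n"
    b"Upgrade: h2c\r\n"
    b"HTTP2-Settings: AAMAAABkAAQAAP__\r\n"
    b"\r\n"
)

# Path 2 (h2 over TLS): the client offers its protocols inside the TLS
# ClientHello via ALPN; the server's choice comes back with the handshake,
# so no extra application-level round trip is needed.
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(["h2", "http/1.1"])  # offered during the handshake
```

After the handshake completes, the client would check `SSLSocket.selected_alpn_protocol()` to see whether `"h2"` was agreed.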

There may be more reasons that I haven't remembered right now. And that's without addressing your more general points about wider TLS usage:

  • Using TLS everywhere is unlikely to result in more use of self-signed certificates: no self-respecting site is going to use one, since you are always going to lose customers, and in any case services such as Let's Encrypt make it trivial to use a trusted certificate.

  • Caching is already often limited to service-controlled caches, and wider use of TLS is unlikely to have a large effect on that. As more and more sites make use of CDNs, intermediate proxies have become less relevant.
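
As a rough illustration of why shared caching is already origin-controlled, here's a simplified Python sketch of how a proxy or CDN decides whether it may store a response, based on the `Cache-Control` header. This is a loose reading of RFC 7234 for illustration; real caches honour many more rules (heuristics, `Expires`, `Vary`, and so on).

```python
def shared_cache_may_store(headers):
    """Very simplified: may a shared (proxy/CDN) cache store this response?"""
    cc = headers.get("Cache-Control", "").lower()
    directives = {d.split("=")[0].strip() for d in cc.split(",") if d.strip()}
    # Explicit prohibitions for shared caches.
    if "no-store" in directives or "private" in directives:
        return False
    # Explicit permission or a freshness grant.
    return bool({"public", "s-maxage", "max-age"} & directives)

# The origin server, not the network, decides:
print(shared_cache_may_store({"Cache-Control": "public, max-age=3600"}))   # True
print(shared_cache_may_store({"Cache-Control": "private, max-age=3600"}))  # False
```

The point being: a site that wants its public content cached by intermediaries already has to say so explicitly, with or without TLS.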

mcfedr