
This question is similar to Is it safe to whitelist CDN domains?, but is focused on the user's perspective.

It seems common for American business websites to use a CDN that serves the JavaScript for core functionality from partially random subdomains. The most common example I find is <somehash>.cloudfront.net.

Users with a domain-name-based blocker like NoScript have a usability/security problem here: they need to load the JS from the CDN domain, but from the user's perspective the domain name isn't memorably or deterministically linked to the site. So it's difficult for them to verify whether that subdomain is serving trustworthy JS for that site, especially if its contents are subject to change.

So what's a good habit for a user here? Trust on first use? When the hashed subdomain changes, what do they do then? What are the significant risks here?

bright-star
    From what I can tell, that is just the Amazon-assigned host name for your CDN. Seems to me as though that is just laziness, as you can easily replace it with a custom DNS entry. But I'm not an AWS expert. – Julian Knight Dec 05 '16 at 21:21

1 Answer


Ideally, sites will use the subresource integrity (SRI) mechanism, as described by the W3C.

This allows you (well, ideally your browser on your behalf) to verify that you are getting the file you expect from the content delivery network: the browser hashes the fetched file and checks that the hash matches the expected one declared in the main resource.
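For example, a page pins a CDN-hosted script with the `integrity` attribute; the subdomain and hash below are illustrative placeholders, not real values:

```html
<!-- The browser fetches the script, computes its SHA-384 digest, and
     refuses to execute it if the digest does not match the value below. -->
<script src="https://d1234abcd.cloudfront.net/app.js"
        integrity="sha384-EXAMPLEPLACEHOLDERdigestvalueBase64Encoded0000000000000000000000"
        crossorigin="anonymous"></script>
```

With this in place, the untrusted part (the CDN file) is verified against a hash carried by the page you already chose to trust, so the unmemorable subdomain no longer matters for integrity.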

If they do not, you are probably out of luck: your "trust on first use, investigate if it fails to match the next time" approach is about as good as it gets. That gap is exactly why this standard was created.
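Lacking SRI on the page, a cautious user can do the pinning manually: download the script once, record its digest in the same base64 SHA-384 form SRI uses, and compare on later visits. A minimal sketch, where `app.js` is a hypothetical stand-in for a file fetched from the CDN:

```shell
# Create a stand-in for a script downloaded from the CDN.
printf 'console.log("hello");\n' > app.js

# Compute the SHA-384 digest, base64-encoded, as SRI expects it.
hash=$(openssl dgst -sha384 -binary app.js | openssl base64 -A)
echo "sha384-$hash"
```

On a later visit, re-downloading the file and re-running the digest command tells you whether the script changed; a mismatch means the file's contents differ, though only inspection of the new file tells you whether the change is benign.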

crovers