28

NoScript is a great plug-in, both for security and for ad blocking. However, I've found it's not always easy to figure out which scripts need to be permitted on certain pages so that I can use the features I want while still blocking unnecessary scripts. Often, it's not sufficient simply to enable scripts for the domain of the site you're visiting.

For example, to use StackExchange to its fullest I've had to enable scripts for:

  • stackexchange.com
  • googleapis.com
  • sstatic.net

In the past, I've generally just figured this out through trial and error. However, that is most definitely not the way it should be done. It leaves me vulnerable to running malicious or advertising scripts during that trial-and-error phase, which could lead to irrecoverable damage.

Most of the time, this issue arises when I want to use a certain feature on a website but the script is hosted on a different domain. I usually start by enabling the "usual suspects" like domain.tld, domaincdn.tld, and domain-images.tld. Still, this doesn't always work. And, somewhat beside the point, there's really no intuitive way (short of running a WHOIS query, and trusting those results) for me to be certain that domain-images.tld is owned and controlled by the same people running domain.tld, or that its scripts actually offer any functionality I want.

Is there an additional plug-in or other method which can be used for me to figure out which domains/scripts need to be white-listed for me to use certain features of a website? Preferably, the method should not require knowledge of any scripting languages or require the user to interpret the sites' source code.

My concept of an "ideal" solution would be a plug-in that allows me to right-click any interactive page element (button, hyperlink, Flash object, etc.) and see a list of sites that host scripts required for that element to perform its function. It should also allow me to right-click an empty spot on the page, and see which domains host scripts that affect the layout and formatting of the page.

This question was IT Security Question of the Week.
Read the Sep 02, 2011 blog entry for more details or submit your own Question of the Week.

Iszi
  • 26,997
  • 18
  • 98
  • 163
  • 1
    By not wanting to validate scripts based on source code, what are you really achieving? I could accept that scripts running from a domain/URL which looks dodgy could be avoided, but otherwise I'm not sure there's a great deal of value there. Perhaps a better approach might be to whitelist all scripts on sites you 'trust', and utilise a sandbox environment for all other sites/content? (upvoted) – lew Jul 04 '11 at 06:47

3 Answers

9

You could install a site advisor plug-in, such as McAfee's SiteAdvisor. With one installed, you can more easily check a domain for any reported maliciousness.

For example, when I look up a domain that a big newspaper serves its ads from, I get a report back saying it is generally OK, but community reports say "Adware, spyware, or viruses (1)".

I'm sure there exist other, and possibly better, plug-ins for looking up a domain's reputation as well.

EDIT: I would like to add this snippet I found from NoScript's FAQ:

Starting with version 1.9.9.61, NoScript offers a "Site Info" page which can help you to assess the trustworthiness of the web sites shown in your NoScript menu. You can access this service by middle-clicking or shift-clicking the relevant menu item. If you're more on the technical side and you want to examine the JavaScript source code before allowing, you can help yourself with JSView.

So if you are wondering whether a domain should be trusted, just middle-click it and NoScript will query the following sites for information:

  • WOT Scorecard
  • McAfee SiteAdvisor®
  • Webmaster Tips Site Information
  • Safe Browsing Diagnostic on google-analytics.com
Chris Dale
  • 16,119
  • 10
  • 56
  • 97
  • 2
    The more I think about this, the more I want to just nuke my whitelist and start checking every site this way before I re-add them! – Iszi Jul 27 '11 at 16:18
  • @Iszi: I clean my whitelist from time to time (every time it's too large for me to remember "why" & "when"). I don't believe in eternal truth and eternal trust. – dan Apr 05 '16 at 09:35
  • **WARNING:** WOT-ratings may be (partly) ok, but do **NOT install any WOT-software**, it can/must be considered as malware!!! - https://thehackernews.com/2016/11/web-of-trust-addon.html – DJCrashdummy Nov 11 '16 at 07:31
5

[Disclosure: I am the co-founder of the company whose product is discussed in this answer.]

In the past, I've generally just figured this out through trial and error. However, that is most definitely not the way it should be done. It leaves me vulnerable to running malicious or advertising scripts during that trial-and-error phase, which could lead to irrecoverable damage.

This is the problem with whitelist based security products. You really can't be sure about every item you add to the list. You just try it and hope for the best. Even if you whitelist a domain, new scripts can be added to that domain, or existing ones could be changed. To be completely sure, you would need to analyze each script before execution to look for malicious activity. I don't believe it is possible to have a generic script analysis program that can look at each script and determine if it is safe or not.
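The point that whitelisting a domain cannot pin the *content* of its scripts can be illustrated with a small sketch. This is illustrative only (NoScript itself does nothing like this): it simply shows that once you record a cryptographic hash of the exact script you decided to trust, any later change to that script is detectable.

```python
# Sketch: detect when a previously-trusted script changes by pinning its hash.
# Illustrative only -- NoScript does not do this; the script bytes are made up.
import hashlib

def script_fingerprint(script_bytes: bytes) -> str:
    """Return a SHA-256 hex digest identifying one exact script version."""
    return hashlib.sha256(script_bytes).hexdigest()

# Suppose we recorded this fingerprint when we first decided to trust the script:
pinned = script_fingerprint(b"console.log('v1');")

# Later, the same URL serves different content -- the pin no longer matches:
current = script_fingerprint(b"console.log('v2');")
print(pinned == current)  # False: the trusted script has changed
```

A domain-level whitelist has no equivalent of the pin: it keeps trusting whatever bytes the domain serves next.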

Is there an additional plug-in or other method which can be used for me to figure out which domains/scripts need to be whitelisted for me to use certain features of a website? Preferably, the method should not require knowledge of any scripting languages or require the user to interpret the sites' source code.

While not exactly what you are asking for, my company built a similar security plugin that solves your problem, but in a different way. We run all the scripts needed for a page, but we run them on a disposable cloud server. This results in the user getting full functionality of a website, without having to whitelist any scripts, and without having any scripts run on their local computer. This saves the user from needing to know scripting languages or requiring them to interpret the sites' source code. In essence, it doesn't matter if the scripts are good or bad, because by running them on our servers they can't affect your computer.

If you are interested, you can learn more on our website.

AviD
  • 72,138
  • 22
  • 136
  • 218
Zuly Gonzalez
  • 394
  • 3
  • 21
  • 2
    "_We run all the scripts needed for a page, but we run them on a disposable cloud server._" so your own servers to see the content of HTTPS pages? – curiousguy Jun 25 '12 at 22:58
  • @curiousguy Yes, our servers have the ability to see the content on HTTPS pages, but no human ever looks at it. Even still, we understand how that can be a little unnerving. To remedy this, we are working on a new version that moves the server VM onto the user's computer. – Zuly Gonzalez Jun 25 '12 at 23:18
  • 1
    It's a trade-off. For some applications it would be unacceptable: not only would I not allow anybody **to intercept connexions with my bank**, I also do **not need** this, because I trust the bank website not to try to attack my web browser (I understand that the bank website _could_ be hacked, but I consider the risk _practically_ negligible). OTOH, I just do _not_ trust every HTTPS website, and I would **not** trust a website **just because it uses HTTPS**. – curiousguy Jun 25 '12 at 23:34
  • 3
    "_This saves the user from needing to know scripting languages or requiring them to interpret the sites' source code._" Actually white-listing a domain has nothing to do with carefully reviewing every JavaScript in this domain; it's about trusting this domain (and trusting the web browser to also protect from the website). – curiousguy Jun 25 '12 at 23:44
  • 4
    From your website [The Web Is A Very Dangerous Place](http://lightpointsecurity.com/products/lightpointweb/the-web-is-dangerous) "_The study also found that the .com domain was the second riskiest of all, behind only .cm (for Cameroon). Why is .cm the most dangerous? .cm sites are commonly used for “typo squatting”. The idea is that a user intending to go to facebook.com might accidentally type facebook.cm, causing them to land on a likely malicious site._" **I do not understand how going to the wrong website (say facebook.cm), and then login in, becomes a non-issue when using your product.** – curiousguy Jun 25 '12 at 23:49
  • 2
    (...) However, it should be noted that nothing in my previous comments imply in any way that I would believe that your product is _without_ merit. I am, however, **very concerned** about such bold statements as "_you don’t have to care if a site is malicious or not._", especially in the context of a discussion of "typo squatting", as **I believe such statements encourage carelessness**. – curiousguy Jun 25 '12 at 23:53
  • @curiousguy It's definitely a tradeoff. I, myself, don't use our product when logging in to my bank website. However, I do use it when I visit just about every other site online. Our product can easily be turned on and off, so one can choose which sites to visit with our protection, and which sites not to. Re: your whitelisting comment, what I was referring to was products like NoScript which let you whitelist specific scripts of a domain. The goal being that a user would enable ONLY the scripts necessary for basic site operation. – Zuly Gonzalez Jun 26 '12 at 00:41
  • @curiousguy _"I do not understand how going to the wrong website (say facebook.cm), **and then login in**, becomes a non-issue when using your product."_ That's because we don't make such a claim ;) See [What limitations does Light Point Web have?](http://bit.ly/NFagIl) in our FAQ. While we can't save the user from typing in credentials to facebook.cm, we do save them from malicious scripts that attempt to infect their computers just by mistakenly visiting the site. – Zuly Gonzalez Jun 26 '12 at 00:42
  • 2
    "_That's because we don't make such a claim ;)_" except when you say: "**you don’t have to care if a site is malicious or not.**" "_See What limitations does Light Point Web have? in our FAQ._" I see. (I guess you are not a teacher, or you would know that you cannot "unsay" anything you have said, because some of the audience may have already "switched off".) When people read "**Accidentally stumbling onto a malicious site by mistyping a website address or clicking a booby-trapped search result or shortened URL is no longer something to worry about.**" what are they supposed to understand? – curiousguy Jun 26 '12 at 01:30
  • 2
    (...) "_Even worse, some users will fool themselves into thinking they are safe when they are not._" Indeed. And the previous quoted text (bolded by me) from your website might actually encourage that foolishness. – curiousguy Jun 26 '12 at 01:31
  • 1
    (...) BTW, it is not just you. I was disturbed by that MS Windows TV commercial last year where a nice-looking but stupid girl clicks on phishing links, but IE safety feature blocks the phishing website, so nothing bad happens **so it's OK to click on random links in emails** according to the TV commercial. You should not encourage irresponsible behaviour, ever. You should not suggest that not going to the intended website is risk-less, ever. – curiousguy Jun 26 '12 at 01:40
  • 2
    "_Re: your whitelisting comment, what I was referring to was products like NoScript which let you whitelist specific scripts of a domain._" I understood that you were commenting about NoScript. White-listing a domain in NoScript has nothing to do with carefully reviewing every JavaScript in this domain; it's about trusting this domain (and trusting the web browser to also protect from the website). By using NoScript I can decide the trade-off between the advantage and the risk of JS and other embedded content (Flash, Java...) **for each site, instead of globally**. – curiousguy Jun 26 '12 at 01:51
  • 1
    @curiousguy Contrary to your statement, **we do not encourage irresponsible behavior.** When we say, _"**Accidentally** stumbling onto a malicious site by mistyping a website address or clicking a booby-trapped search result or shortened URL is no longer something to worry about."_ That's exactly what we mean. There's no need to read into it. Note the word _accidentally_. Our goal is to provide security while giving users the ability to use the Internet as it is intended, unlike NoScript which provides security at the cost of usability, and requires some knowledge to use effectively. – Zuly Gonzalez Jun 26 '12 at 02:25
  • 1
    However, we are getting off-topic here. This is not the place to debate the merits of our product (outside of comparing it to NoScript, which is the intention of the question and my answer). Feel free to email me and we can discuss this further. – Zuly Gonzalez Jun 26 '12 at 02:25
  • 2
    "_This is not the place to debate the merits of our product_" But **I was not doing that**. I was pointing out extremely disturbing **claims** on your website, not problems with the design of your product. This is different. (I'd like to know if others think this is really off-topic.) – curiousguy Jun 26 '12 at 02:30
  • 1
    With all respect, but I was shocked by this service. Under "privacy", it claims: "When you visit a website without Light Point Web, that website’s owner can learn information about you" yes, so instead you propose to disclose *everything* to your service! This is beyond facebook evil. – Jacco Jul 10 '12 at 12:02
2

I suspect that the "right" answer would be for there to be an extension to HTML in which the website itself declared which domains were under its direct control and which were third party scripts (e.g. Stack Exchange would declare stackexchange.com, sstatic.net as under direct control, googleapis.com as an essential third-party site, and others as advertising sites).
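
Such a declaration might look something like the following sketch. To be clear, the `<script-sources>` element and all of its attributes are invented here purely for illustration; no browser or standard implements them:

```html
<!-- Hypothetical markup: the site declares which script domains it uses,
     who controls them, and why. Every name below is invented. -->
<script-sources>
  <source domain="stackexchange.com" control="first-party" purpose="core site features" />
  <source domain="sstatic.net"       control="first-party" purpose="static assets" />
  <source domain="googleapis.com"    control="third-party" purpose="essential libraries" />
  <source domain="adnetwork.example" control="third-party" purpose="advertising" />
</script-sources>
```

A NoScript-style tool could then display these purposes next to each domain in its menu, while still treating the declarations as claims to be verified rather than as facts. (The Content Security Policy `script-src` directive addresses a related need from the other direction, letting a site restrict which domains may serve it scripts, though it describes no purpose or ownership.)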

It might seem odd - after all, the whole point of NoScript is that you don't trust the site in the first place - but when you're permitting scripts from the site itself, you have decided to trust (for example) Stack Exchange and you just want to designate all the domains it trusts.

Obviously, a site could lie and list all of the ad sites it deals with as part of its internal structure, but NoScript's UI would have to let you know what you were doing.

Mind you, proposing extensions to HTML isn't exactly a practical solution to your problem!

Richard Gadsden
  • 501
  • 1
  • 4
  • 11