How can I make my browser ignore `<noscript>` tag content even when javascript is disabled?

Being a bit overcautious, my browsers have a whitelist for the javascript they can run. But `<noscript>` tag content can give away this preference of mine, because the browser then attempts to load those pesky 1x1 analytics images.

Is there a way to make a browser ignore[1] this content even when javascript is disabled?

I do not mention a specific browser because I can swap between open-source browsers if needed (Chromium, Firefox, Opera, etc.), so answers that address a single browser are fine.


[1] By "ignore" I mean don't make requests for bait content that will give away javascript-disabled status. I can put up a userscript that deletes <noscript> from the DOM, but by then it might be too late.

Mindwin

Posted 2017-01-20T11:49:01.610

Reputation: 175

Short of that, you won't have an option. You could of course try to use actual NoScript or ad blockers, which usually also block those things. If you do want to change the behavior of a browser on such a fundamental level, you will probably have to roll your own. – Seth – 2017-01-20T13:04:39.670

@Seth browsers are too permissive (promiscuous?). They do (lots of) things behind your back that you'd rather not have allowed if you knew what they were. But Stallman has warned us several times already.

– Mindwin – 2017-01-20T13:16:16.493

@Seth ABP works, but I have to blacklist stuff one by one. NoScript does not do much for `<noscript>` content unless it contains another threat. RequestPolicy (Firefox) can block cross-site requests. One can hedge the problem down to a minimum, but it is still there. – Mindwin – 2017-01-20T13:20:42.607

Answers

You mean nosy code like this?

<noscript>
    <img height="1" width="1" alt="" style="display:none"
     src="https://www.facebook.com/tr?id=redacted&ev=PageView&noscript=1">
</noscript>

I don't know of a way to turn off evaluation of `<noscript>`, but in the above example you could write a short browser extension to cancel any request whose URL contains the substring facebook.com/tr?.

See chrome.webRequest for a description of the Chrome browser's API for watching, modifying, or blocking requests in flight.

Here is a working example:

manifest.json:

{
    "name": "Website Blocker",
    "description": "Keeps the browser from fetching tracking URLs",
    "version": "1.0",
    "manifest_version": 2,
    "permissions": [
        "webRequest",
        "webRequestBlocking",
        "*://*.facebook.com/tr?*"
    ],
    "background": {
        "scripts": [
            "script.js"
        ],
        "persistent": true
    },
    "icons": {
        "256": "world-blocker.png"
    },
    "converted_from_user_script": true
}

script.js:

"use strict"

console.log("Website Blocker is running!");
var re = new RegExp('https?://.*?\.?facebook.com/tr\?', 'i');

function checkUrl(details) {
    var cancel = !!details.url.match(re);
    if (!cancel)
        console.log(`Passing ${details.url}`);
    return {cancel: cancel};
}

chrome.webRequest.onBeforeRequest.addListener(
    checkUrl,
    {urls: ['*://*.facebook.com/tr?*']},
    ['blocking']
);

Use whatever you want for the icon. Or delete it and let the browser supply a dummy one. Its only role is to highlight the extension's entry on the chrome://extensions page.

The extension worked fine on the page in question. The page's console shows a net::ERR_BLOCKED_BY_CLIENT error for the blocked image fetch, and the Network tab shows a matching red failure line.

The URL pre-filtering also works well: the listener only gets called for the specific tracking URL. Checking the extension's console, I see no "Passing ..." messages, even when I visit Facebook.

George

Posted 2017-01-20T11:49:01.610

Reputation: 111

I'm not sure what your actual concern is with revealing whether or not your browser has JavaScript enabled. You're correct that certain browsers do more than some users would like them to. The article you linked in one of your comments is, in my opinion, about a different topic than your question, though.

As far as the scenario stated in your question is concerned, it's the normal operation of a browser to request resources that are part of a page. A browser cannot tell what data a URL will return just by looking at it, and as such it is not able to detect a 1x1 tracking pixel before actually loading it. This is also nothing your browser does "behind your back"; you're probably more concerned about the third party that delivers that graphic than about your browser.

The solutions I can see in this matter (right now) are the following:

  • Block all requests from foreign domains. You already seem to be aware of a tool-set that could do that. You could, e.g., use RequestPolicy and set the scope accordingly. This won't protect you from tracking pixels that are delivered from the same domain you're visiting, though.
  • Use something like a GreaseMonkey script or similar to modify the DOM and remove the `<noscript>` part (see the sketch after this list). Depending on when this script runs, it might be too late, and your browser might try to pre-cache the content as well. The following question could be helpful with this: how to run greasemonkey script before the page content is displayed?
  • Use a text browser.
  • Fork your favorite browser and modify it not to evaluate the `<noscript>` part. This would probably be the most secure option, but also the most complex.
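
As a rough sketch of the GreaseMonkey option above (untested; the script name is made up, and whether it runs early enough to beat the image request will depend on the browser and the userscript engine), a script set to run at document-start could strip `<noscript>` elements as the parser adds them:

// ==UserScript==
// @name     drop-noscript-early
// @include  *
// @run-at   document-start
// ==/UserScript==

// Watch the DOM as it is built and remove <noscript> elements as soon as
// they are added, instead of waiting for the page to finish loading.
var observer = new MutationObserver(function (mutations) {
    mutations.forEach(function (mutation) {
        mutation.addedNodes.forEach(function (node) {
            if (node.nodeName === 'NOSCRIPT') {
                node.remove();
            }
        });
    });
});

observer.observe(document, {childList: true, subtree: true});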

On certain sites the `<noscript>` tag could contain information that is valuable to you. As you would not be able to discern the nature of that content without looking at it, potentially loading additional resources, there doesn't seem to be a good way to identify "bad" content here, except with some rather subjective rules.

Furthermore, if you're concerned about tracking rather than about revealing whether or not you have JavaScript enabled, there are a lot more technologies that could expose you, like the `canvas` element, Flash and cookies. So the question would be what your real goal is in ignoring that section of a page when a lot of the other components might be just as dangerous.

Seth

Posted 2017-01-20T11:49:01.610

Reputation: 7 657

If you use uBlock Origin, the filter

##noscript

should work, although I haven't tested it. uBlock Origin applies its filters before the page is rendered, so this should prevent those elements from even being fetched.
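
If hiding the element turns out not to stop the underlying requests (cosmetic filters only hide matching elements), a static network filter along these lines, untested and specific to the tracker in question, should block the pixel itself:

||facebook.com/tr^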

Gavin S. Yancey

Posted 2017-01-20T11:49:01.610

Reputation: 198

I just tested this. It prevents the content in a `<noscript>` tag from being displayed; however, all images and iframe contents are still downloaded. – jdgregson – 2018-12-31T22:14:22.333