
Are there any tools or techniques that allow a scan of multiple sites for the use of tracking cookies, beacons, web bugs, or offsite JavaScript includes?

As far as I can tell, major web application scanners do not report on these attributes.

One can manually navigate to sites using Ghostery and Disconnect, but doing so for a large number of URLs would be time-consuming.

Hypothetically, one could obtain the Ghostery and Disconnect block lists, then spider each site and look for references to the listed domains. Still, it would be nice if there were a clean-cut tool or technique.
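For what it's worth, the "spider and match against a block list" idea can be sketched in a few lines of Python. This is only a rough illustration, not a finished tool: the blocklist below is a tiny hand-picked set (the real Disconnect/Ghostery lists are far larger and published in their own formats), fetching and crawling are omitted, and only `<script src>` and `<img src>` references are inspected.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Toy blocklist of tracker domains. A real scan would load the
# Disconnect or Ghostery lists instead of hard-coding a few entries.
BLOCKLIST = {"google-analytics.com", "doubleclick.net", "scorecardresearch.com"}

class OffsiteRefFinder(HTMLParser):
    """Collect domains of offsite <script src> and <img src> (beacon) tags."""
    def __init__(self, page_domain):
        super().__init__()
        self.page_domain = page_domain
        self.offsite = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img"):
            return
        src = dict(attrs).get("src", "")
        domain = urlparse(src).netloc
        # Relative URLs have an empty netloc, so only true offsite
        # references land in self.offsite.
        if domain and domain != self.page_domain:
            self.offsite.add(domain)

def scan(html, page_domain):
    """Return (all offsite domains, the subset matching the blocklist)."""
    finder = OffsiteRefFinder(page_domain)
    finder.feed(html)
    flagged = {d for d in finder.offsite
               if any(d == b or d.endswith("." + b) for b in BLOCKLIST)}
    return finder.offsite, flagged

# Example page as a literal string; a real scanner would fetch each
# spidered URL instead.
page = """
<html><body>
<script src="https://www.google-analytics.com/analytics.js"></script>
<img src="https://ad.doubleclick.net/pixel.gif" width="1" height="1">
<script src="/local/app.js"></script>
</body></html>
"""
offsite, flagged = scan(page, "example.com")
print(sorted(flagged))
```

Running this flags `www.google-analytics.com` and `ad.doubleclick.net` while ignoring the same-origin script. Feeding it pages gathered by any off-the-shelf spider would give a crude version of the scan described above.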

Citizen
Ben Walther
    If you're still looking, try OWASP ZAP; it has most of the features you're looking for. – Lighty Jun 26 '14 at 09:26
  • 1
    Old question, but even more relevant today than a year ago, imo. This is not a scraper that scans pages for 3P inserts, but rather an active 3P cataloger/visualizer within Firefox: [Lightbeam](https://www.mozilla.org/en-US/lightbeam/). It is sometimes shocking to see what mainstream sites do when you're not looking. – zxq9 Oct 31 '15 at 07:00

0 Answers