Possible to catch URLs in Linux?


There is a great program for Windows (URL Snooper: http://www.donationcoder.com/Software/Mouser/urlsnooper/index.html) that lets you view all URLs being requested on the machine.

Does any such program exist in Linux (preferably command line)?

simonwjackson

Posted 2009-11-08T08:31:48.273


I'm not sure what it is that you're trying to do here. – None – 2009-11-08T08:41:37.303

Actually, it seems that URL Snooper applies not only to URLs being requested on the machine, but also to URLs hidden in the HTML source of some page, which have not necessarily been requested yet. For the latter see also "How to download list of files from a file server?" at http://superuser.com/questions/47089/wget-download-list-of-files-on-file-server

– Arjan – 2009-11-08T13:00:32.750

Answers


It seems that URL Snooper applies not only to URLs being requested on the machine, but also to URLs hidden in the HTML source of some page, which have not necessarily been requested yet. For the latter see also "How to download list of files from a file server?" here at Super User. Or, in Firefox, see menu Tools » Page Info » Media, or use add-ons like Video DownloadHelper or UnPlug. The following applies to seeing all URLs that are actually requested.

The command-line tool ngrep can do it, but gives far more detail than you'd probably want.

For example: it will not simply show you the URL as typed into the location bar of a browser, but the whole HTTP request (so: the IP address as resolved by the browser before actually making the request, and then the HTTP request the browser sends to that IP address). And it will also show this for every image and other resource used in the resulting page.

You might need to install ngrep first, for example on a default installation of Ubuntu:

sudo apt-get install ngrep

To capture all HTTP GET requests to port 80:

sudo ngrep -W byline -qilw 'get' tcp dst port 80

Still, that would show you the whole request. (Try it for yourself, if you're a Super User!) To limit the output further, showing only lines that contain ->, get, or host:

sudo ngrep -W byline -qilw 'get' tcp dst port 80 \
  | grep -i " -> \|get\|host"
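Since ngrep prints the GET path and the Host header on separate lines, getting complete URLs takes a little post-processing. A minimal sketch of such a script (assuming Python 3, and input shaped like the request lines ngrep's byline mode prints; the function name is my own invention):

```python
#!/usr/bin/env python3
# Join the "GET <path>" and "Host: <name>" lines from captured HTTP
# requests back into full URLs. Pipe the ngrep command above into this
# script. Assumes plain HTTP on port 80, one request at a time.
import re
import sys

def extract_urls(lines):
    """Yield one URL for every GET line followed by a Host line."""
    path = None
    for line in lines:
        line = line.strip()
        match = re.match(r"(?i)GET\s+(\S+)", line)
        if match:
            path = match.group(1)
            continue
        match = re.match(r"(?i)Host:\s*(\S+)", line)
        if match and path:
            yield "http://" + match.group(1) + path
            path = None

if __name__ == "__main__":
    for url in extract_urls(sys.stdin):
        print(url)
```

For example, a captured GET /index.html HTTP/1.1 followed by Host: example.com would be printed as http://example.com/index.html.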

Or, to capture all requests to port 80 but ignore those with the Referer header set (which is set when requesting embedded images and the like, but also when clicking a link in a web page), thus only showing requests that are typed into a browser's location bar directly, opened in a new window, or opened from a bookmark or email:

sudo ngrep -W byline -qilwv 'referer' tcp dst port 80 \
  | grep -i " -> \|get\|host"

Sniffer tools like Wireshark have command-line options too. And, just as an aside and far more basic, tcpdump is installed on most Linux distributions. In the filter below, ip[2:2] > 40 reads the IP total-length field and skips packets with no payload (40 bytes being bare IP plus TCP headers), and the tcp-push test matches segments that actually carry data:

sudo tcpdump -Alfq -s 1024 \
  'tcp dst port 80 and ip[2:2] > 40 and tcp[tcpflags] & tcp-push != 0' \
  | grep -i " > \|get\|host"

Arjan



I can also recommend url-sniff by Pawel Pawilcz. It is a lightweight Perl script that wraps nicely around ngrep, supports colorized output, and gives you a simple interface for sniffing all requested URLs.

timn


The url-sniff website is offline, but there is a copy in the archive.

– Stefan Schmidt – 2012-02-06T13:36:15.303


You could use an HTTP proxy such as Privoxy, but you'll have to configure your browser to use it; it doesn't snoop network traffic. It keeps a log of the URLs accessed, which you can view with a text editor.
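If you'd rather not install anything, a few lines of Python can act as such a logging proxy. A rough sketch (assuming Python 3's standard library; the class name, port, and helper are my own invention, and it handles plain-HTTP GETs only, no HTTPS):

```python
# A tiny URL-logging HTTP forward proxy, plain-HTTP GET only.
# Configure the browser to use 127.0.0.1:8080 as its HTTP proxy;
# every requested URL is printed to stdout before being fetched.
import http.server
import urllib.request

class LoggingProxy(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # For proxy-style requests, self.path is the full URL asked for.
        print(self.path)
        try:
            with urllib.request.urlopen(self.path) as resp:
                body = resp.read()
            self.send_response(resp.status)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        except Exception:
            self.send_error(502)

def serve(port=8080):
    http.server.HTTPServer(("127.0.0.1", port), LoggingProxy).serve_forever()
```

Call serve() and point the browser's HTTP proxy setting at 127.0.0.1:8080; a real setup would want HTTPS (CONNECT) support, which tools like Privoxy already provide.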

Hugh Allen



  1. Run a proxy that logs the requests, as suggested in another answer. You don't need to configure anything on the client side if you go for a transparent proxy.
  2. Run a sniffer on the gateway.

geek
