
I was searching for a tool to capture HTTP packets sent from a Linux server to an external server. Normally I use iftop or iptraf with filters to see real-time information, and tcpdump to get verbose information. But what I need right now is some way to capture all the URL requests to a log file. I know that I could configure a proxy to log all this information, but that is impossible because of our current architecture. Do you know of a tool that can get this information? Or do I need to write a script to process the output of tcpdump?

radius
hdanniel

3 Answers


What you need is urlsnarf from the dsniff project. It will generate a log of all HTTP requests seen on a network interface. A very good tool!
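A minimal sketch of how this could be run (the interface name and log path are assumptions; check the urlsnarf man page for the exact options your version supports):

```shell
# Log every HTTP request seen on eth0 in Common Log Format;
# -n skips reverse DNS lookups so logging keeps up with traffic.
urlsnarf -n -i eth0 | tee -a /var/log/http-urls.log

# Alternatively, capture with tcpdump first and replay the pcap
# through urlsnarf offline (-p reads from a capture file):
tcpdump -i eth0 -s 0 -w /tmp/http.pcap 'tcp port 80'
urlsnarf -n -p /tmp/http.pcap
```

Both commands need root (or CAP_NET_RAW) to sniff the interface.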

radius
  • Yes, that's what I was looking for. It's a small application, it can log to a file in a standard format, and it works for real-time analysis. It can also work with tcpdump. Thanks. – hdanniel Jun 19 '09 at 23:38

Sounds like a job for Wireshark (formerly known as Ethereal).

Look at the HTTP protocol support and the display filters for it. You probably want a display filter of "http.request.uri".
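For unattended logging to a file, Wireshark's command-line sibling tshark can apply the same display filter; a sketch (interface name is an assumption, and older tshark versions use -R where newer ones use -Y for display filters):

```shell
# Print only the host and request URI of each HTTP request seen on eth0,
# one request per line, suitable for redirecting to a log file.
tshark -i eth0 -Y 'http.request' \
       -T fields -e http.host -e http.request.uri
```

Like any sniffer, this needs capture privileges on the interface.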

crb
  • Wireshark will definitely do what you are asking. It's easy to use and you can set it up so that you only capture HTTP packets. – RascalKing Jun 19 '09 at 23:17

I'd normally suggest WebScarab, but it acts as an HTTP proxy which you said doesn't work in your situation...

You'll need something that can listen promiscuously and then analyze things at the application protocol level. Here's a thread with a Perl script that looks to do what you want by analyzing tcpdump output.
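The tcpdump-parsing approach can be sketched without the linked script: dump packet payloads as ASCII and pull out the request line and Host header. The sample payload below is hypothetical, standing in for live `tcpdump -l -A -s 0 'tcp port 80'` output so the parsing step is visible on its own:

```shell
# Hypothetical fragment of `tcpdump -A` output for two HTTP requests.
sample='GET /index.html HTTP/1.1
Host: example.com
User-Agent: curl/7.19.5

GET /images/logo.png HTTP/1.1
Host: example.com'

# Remember the path from each request line, then emit a full URL
# when the matching Host: header arrives.
printf '%s\n' "$sample" |
awk '/^GET |^POST / { path = $2 }
     /^Host: /      { sub(/\r$/, "", $2); print "http://" $2 path }'
```

Against live traffic you would replace the `printf` with `tcpdump -l -A -s 0 'tcp port 80'` and redirect the awk output to a log file; it is a rough sketch, not a full HTTP parser (pipelined or cross-packet requests would need real reassembly).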

squillman