I was searching for a tool to capture HTTP packets sent from a Linux server to an external server. Normally I use iftop or iptraf with filters to see real-time information, and tcpdump to get verbose information. But what I need right now is some way to capture all the URL requests to a log file. I know that I could configure a proxy to log all this information, but that is impossible because of our current architecture. Do you know of a tool that can get this information? Or do I need to write a script to process the output of tcpdump?
What you need is urlsnarf from the dsniff project. It will generate a log of all HTTP requests seen on a network interface. A very good tool!
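For example, something along these lines should work (the interface name `eth0` and the log path are assumptions, adjust for your setup):

```shell
# Sniff HTTP requests on eth0 and append them to a log file.
# urlsnarf prints requests in CLF (Common Log Format); -n skips DNS lookups.
sudo urlsnarf -n -i eth0 | tee -a /var/log/urlsnarf.log
```

You can also point it at a saved capture with `-p pcapfile` instead of a live interface.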
Sounds like a job for Wireshark (formerly known as Ethereal).
Look at the HTTP protocol support and the display filters for it. You probably want a display filter of "http.request.uri".
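If you need this on a headless server, tshark (Wireshark's command-line counterpart) can apply the same display filter and print just the URL fields. A sketch, assuming interface `eth0` and a recent tshark (older releases used `-R` instead of `-Y` for display filters):

```shell
# Print "host<TAB>uri" for every HTTP request seen on eth0.
sudo tshark -i eth0 -Y http.request -T fields -e http.host -e http.request.uri
```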
I'd normally suggest WebScarab, but it acts as an HTTP proxy which you said doesn't work in your situation...
You'll need something that can listen promiscuously and then analyze traffic at the application-protocol level. Here's a thread with a Perl script that appears to do what you want by parsing tcpdump output.
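If you'd rather not pull in a script at all, the same idea fits in a one-line pipeline: have tcpdump print packet payloads as ASCII and pull out the request line and Host header. A minimal sketch, assuming plain HTTP on port 80 (this won't see HTTPS, and awk's output may lag behind due to buffering):

```shell
# -l: line-buffered output; -n: no DNS lookups; -A: print payload as ASCII;
# -s0: capture full packets. The awk program remembers the request URI and
# prints "host + path" when the matching Host: header line arrives.
sudo tcpdump -lnA -s0 'tcp dst port 80' \
  | awk '/^(GET|POST|HEAD) /{u=$2} /^Host: /{sub(/\r$/,""); print $2 u}'
```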