I am trying to test whether a site/server/whatever has the ability to pipeline HTTP requests.
So far, I think it is possible to send several requests through one curl session, like:
curl http://www.example.com/1.html http://www.example.com/main.css http://www.example.com/2.html
Is it possible to use this to simulate pipelining, and how can I actually verify that pipelining was used?
I'm also thinking about using netcat: send a few requests and verify whether the responses come back in the correct order (as required by the RFC), along the lines of the sketch below.
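Roughly what I have in mind with netcat (www.example.com and the paths are just placeholders, and depending on your netcat variant you may need an option to keep the connection open until the responses arrive):

    printf 'GET /1.html HTTP/1.1\r\nHost: www.example.com\r\n\r\nGET /2.html HTTP/1.1\r\nHost: www.example.com\r\nConnection: close\r\n\r\n' \
        | nc www.example.com 80

Both requests are written back-to-back without waiting for the first response, so if the server pipelines, the two responses should come back on the same connection in request order.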
Could you give me some hints about how to verify that content was pipelined? (Ideally on Linux, but that's not necessary.)
I just checked with tcpdump, and it looks like the simple invocation of curl you're suggesting doesn't actually pipeline requests. The man page states they are fetched sequentially, which also suggests no pipelining.
When I've tested this in the past, I've just pasted the complete HTTP/1.1 requests into a telnet session, something like the example below.
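For example, after connecting with "telnet www.example.com 80" (hostname and paths are placeholders), paste both requests in one go, ending each header block with a blank line, including the last one:

    GET /1.html HTTP/1.1
    Host: www.example.com

    GET /2.html HTTP/1.1
    Host: www.example.com
    Connection: close

The Connection: close on the second request makes the server close the connection when it's finished, so you know you've seen everything.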
To check, keep a tcpdump running in another window, e.g. "tcpdump -nli any -s0 -xX host 1.2.3.4 and port 80". If you try the curl command line, you'll see that the first request is sent, a response is delivered, and then the second one is sent. If you are correctly pipelining, you should be able to see multiple GET lines going out before any response data comes back (see the sketch below).
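Concretely, the check looks something like this (the IP and hostname are placeholders for the server under test; start the capture first, then fire the requests from a second terminal):

    # terminal 1: capture the HTTP traffic; the -xX dump shows the payload,
    # so the GET lines and the response status lines are visible
    tcpdump -nli any -s0 -xX host 1.2.3.4 and port 80

    # terminal 2: push two requests down one connection without waiting,
    # e.g. with the printf/netcat one-liner from the question
    printf 'GET /1.html HTTP/1.1\r\nHost: www.example.com\r\n\r\nGET /2.html HTTP/1.1\r\nHost: www.example.com\r\nConnection: close\r\n\r\n' \
        | nc www.example.com 80

If both GETs appear in the capture before any response data, the requests were pipelined; if the second GET only shows up after the first response, they went out sequentially.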
This feels a bit like a homework question, so I'm going to leave you there.