I have a box with Gigabit Ethernet, and I'm unable to get past about 200Mbps or 250Mbps in my download tests.
I do the tests like this:
% wget -6 -O /dev/null --progress=dot:mega http://proof.ovh.ca/files/1Gb.dat
--2013-07-25 12:32:08-- http://proof.ovh.ca/files/1Gb.dat
Resolving proof.ovh.ca (proof.ovh.ca)... 2607:5300:60:273a::1
Connecting to proof.ovh.ca (proof.ovh.ca)|2607:5300:60:273a::1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 125000000 (119M) [application/x-ns-proxy-autoconfig]
Saving to: ‘/dev/null’
0K ........ ........ ........ ........ ........ ........ 2% 5.63M 21s
3072K ........ ........ ........ ........ ........ ........ 5% 13.4M 14s
6144K ........ ........ ........ ........ ........ ........ 7% 15.8M 12s
9216K ........ ........ ........ ........ ........ ........ 10% 19.7M 10s
12288K ........ ........ ........ ........ ........ ........ 12% 18.1M 9s
15360K ........ ........ ........ ........ ........ ........ 15% 19.4M 8s
18432K ........ ........ ........ ........ ........ ........ 17% 20.1M 7s
With the constraint that I only control one server which I want to test, and not the sites against which I want to perform the tests, how do I do a fair test?
Basically, is there some tool that would let me download a 100MB file in several simultaneous TCP streams over HTTP?
Or download several files at once in one go?
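I know I could just fire off several wget processes in parallel and add up the rates by hand, something like:

% wget -6 -O /dev/null http://proof.ovh.ca/files/1Gb.dat &
% wget -6 -O /dev/null http://proof.ovh.ca/files/1Gb.dat &
% wget -6 -O /dev/null http://proof.ovh.ca/files/1Gb.dat &

but reading the combined rate off interleaved progress output is painful, which is why I'm hoping for a purpose-built tool.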
aria2 is a command-line tool, similar to wget, that supports multiple simultaneous connections per download over HTTP, FTP, BitTorrent, etc.
Download the file with 15 connections straight to /dev/null.
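Something along these lines should do it (flags per the aria2c manual: -x and -s raise the per-server connection and split counts to 15, -k 1M lowers the minimum piece size so a ~119 MB file can actually be split that many ways, --file-allocation=none skips the space pre-allocation mentioned below, and -d /dev -o null sends the output to /dev/null; the URL is the test file from the question):

% aria2c -x 15 -s 15 -k 1M --file-allocation=none --allow-overwrite=true \
    -d /dev -o null http://proof.ovh.ca/files/1Gb.dat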
--allow-overwrite prevents aria2 from trying to rename /dev/null.
I also prefer not to pre-allocate space before the download, since allocation just delays the start of the transfer.
You will be limited to less than the speed of the slowest link. You could have a 10-gigabit connection, but if your Internet connection is dial-up, you are going to be waiting. Even on a LAN that can support 1 Gbit/s end to end, you may see a bottleneck in the read speed of the source server or the write speed of the destination server.
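If you suspect the disks rather than the network, a rough way to check is to time a raw read of a big file on the box you control (the path below is just a placeholder; dd prints the achieved throughput when it finishes):

% dd if=/path/to/some/large/file of=/dev/null bs=1M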
There are many factors that contribute to this:
For one thing, you're downloading over the Internet. Let's assume you truly have a gigabit down connection at your disposal:
TCP overhead can eat anywhere from 5-10% of your bandwidth - for simplicity's sake let's say 10%. So you're down to 900 Mbit/s (there's a quick arithmetic check after these points).
Remote server load is a major factor, and you can't control or see it. Many servers can easily sustain 200 MB/s reads, but under load that can drop considerably.
Routing is a factor in speed too. If your route is saturated, speed will suffer.
And finally... do you really have a gigabit connection to the Internet, or is it just your port speed? Speeds are limited by the slowest link that you cross. Also, if you have a hosted server with a gigabit link, that link is often shared with other clients, so you don't get a dedicated gigabit to begin with.
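To put rough numbers on the overhead point above (plain arithmetic, nothing specific to this setup): 1 Gbit/s minus ~10% works out to about 112 MB/s of payload, while the ~250 Mbit/s from the question is only about 31 MB/s:

% echo '1000 * 0.9 / 8' | bc -l
% echo '250 / 8' | bc -l

So the observed rate is far below what the pipe could carry, which suggests the limit is one of the factors above rather than the link itself.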
Edit: The reason I didn't recommend a tool is that they're a Google search away and there are tons of them.