I have a box with Gigabit Ethernet, but I can't get past about 200–250 Mbps in my download tests.
I do the tests like this:
% wget -6 -O /dev/null --progress=dot:mega http://proof.ovh.ca/files/1Gb.dat
--2013-07-25 12:32:08-- http://proof.ovh.ca/files/1Gb.dat
Resolving proof.ovh.ca (proof.ovh.ca)... 2607:5300:60:273a::1
Connecting to proof.ovh.ca (proof.ovh.ca)|2607:5300:60:273a::1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 125000000 (119M) [application/x-ns-proxy-autoconfig]
Saving to: ‘/dev/null’
0K ........ ........ ........ ........ ........ ........ 2% 5.63M 21s
3072K ........ ........ ........ ........ ........ ........ 5% 13.4M 14s
6144K ........ ........ ........ ........ ........ ........ 7% 15.8M 12s
9216K ........ ........ ........ ........ ........ ........ 10% 19.7M 10s
12288K ........ ........ ........ ........ ........ ........ 12% 18.1M 9s
15360K ........ ........ ........ ........ ........ ........ 15% 19.4M 8s
18432K ........ ........ ........ ........ ........ ........ 17% 20.1M 7s
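The single-stream rate levels off around 20M per second. As a sanity check that this really is below my ceiling (assuming wget's "M" here means MiB/s):

```shell
# Convert wget's per-stream rate (MiB/s) to Mbit/s for comparison
# with the ~200-250 Mbps ceiling and the 1000 Mbps link.
rate_mib=20
rate_mbps=$(( rate_mib * 8 * 1048576 / 1000000 ))
echo "${rate_mbps} Mbit/s"
```

So one TCP stream gives me roughly 167 Mbit/s, i.e. a single download never even reaches the ceiling I'm trying to measure.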
Given that I control only the one server I want to test, and not the remote sites I test against, how do I run a fair test?
Basically, is there some tool that would let me download a 100 MB file over several simultaneous TCP streams via HTTP?
Or download several files at once in one go?
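To illustrate what I have in mind: splitting one file into byte ranges and fetching the pieces concurrently with HTTP Range requests. A rough sketch (the URL, file size, and stream count are placeholders; it prints the curl commands instead of running them — drop the leading `echo`, append `&` to each, and add a final `wait` to actually fetch in parallel):

```shell
#!/bin/sh
# Split one file into N byte ranges for parallel HTTP Range requests.
# Assumes the server honors Range headers; size comes from Content-Length.
url="http://proof.ovh.ca/files/1Gb.dat"
size=125000000
streams=4
chunk=$(( (size + streams - 1) / streams ))   # ceil(size / streams)

i=0
while [ "$i" -lt "$streams" ]; do
    start=$(( i * chunk ))
    end=$(( start + chunk - 1 ))
    [ "$end" -ge "$size" ] && end=$(( size - 1 ))
    # Dry run: print the command rather than executing it.
    echo curl -s -r "${start}-${end}" -o /dev/null "$url"
    i=$(( i + 1 ))
done
```

The aggregate throughput would then be the sum of the streams, which should get closer to the real link capacity than any single TCP connection.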