Why is cURL slower than wget?


Test page: https://www.beobank.be/nl/Home.aspx

This is the result using wget (or a real browser):

# time wget https://www.beobank.be/nl/Home.aspx -O /dev/null
--2015-01-26 12:05:46--  https://www.beobank.be/nl/Home.aspx
Resolving www.beobank.be (www.beobank.be)... 62.213.211.94
Connecting to www.beobank.be (www.beobank.be)|62.213.211.94|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 33444 (33K) [text/html]
Saving to: `/dev/null'

100%[======================================================================================================================================================>] 33,444      --.-K/s   in 0.05s   

2015-01-26 12:05:47 (670 KB/s) - `/dev/null' saved [33444/33444]


real    0m1.327s
user    0m1.072s
sys     0m0.060s

And this is the result using curl:

# time curl https://www.beobank.be/nl/Home.aspx &>/dev/null

real    1m0.741s
user    0m0.012s
sys     0m0.012s

OS X doesn't seem to have this problem (cURL is fast there); it only happens on Linux as far as I can tell. All the machines I've tried (several, all Debian based and running the latest software) have IPv6 enabled, but that website has no IPv6 records. Every failing test takes a little over 1 minute, which suggests something is timing out.
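
A quick way to check whether the host publishes AAAA records is dig (from the dnsutils package on Debian); an empty answer to the AAAA query would confirm there is no IPv6 address for the name:

# dig +short A www.beobank.be
# dig +short AAAA www.beobank.be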

Request to Google is fast:

# time curl https://www.google.com/ &>/dev/null

real    0m0.550s
user    0m0.020s
sys     0m0.012s

Adding the "-4" param to cURL to force IPv4 doesn't seem to change anything.
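
One way to narrow down where the minute goes is curl's --write-out timing variables, which report name lookup, TCP connect, TLS handshake and total times separately (a diagnostic sketch, not output from the original tests):

# time curl -4 -s -o /dev/null \
    -w 'namelookup: %{time_namelookup}\nconnect: %{time_connect}\nappconnect: %{time_appconnect}\ntotal: %{time_total}\n' \
    https://www.beobank.be/nl/Home.aspx

If time_namelookup is close to the total, the delay is in DNS resolution; if time_appconnect accounts for most of it, the TLS handshake is where it hangs.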

What could be the cause?

Tuinslak

Posted 2015-01-26T11:09:55.173

Reputation: 113

I don't know for sure, but I guess it's because of the structure of curl compared to wget: wget is on the OS by default and is simpler, while curl is more complex and used for more complex work ... – TechLife – 2015-01-26T11:15:43.497

I tried these tests and got SSL errors, which could be a clue: maybe curl spends more time retrying in an attempt to resolve the error, though in my case wget also took about a minute and gave an error. curl's message was a bit more informative: curl: (35) Unknown SSL protocol error in connection to www.beobank.be:443. Because you redirected stderr you will have missed the errors - try again without the redirections. My tests were done on Ubuntu 14.10. – AFH – 2015-01-26T11:49:17.473
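
For example, dropping the &>/dev/null and adding -v keeps the TLS handshake output and any error message visible on stderr while still discarding the page body (a sketch of the re-test AFH suggests):

# curl -v -o /dev/null https://www.beobank.be/nl/Home.aspx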

@TechLife That doesn't really make sense. There must be a reason it hangs for a minute before anything happens and then continues. – Tuinslak – 2015-01-26T11:49:37.477

You would need to use wireshark or similar to find out more. – AFH – 2015-01-26T11:50:47.693

Thanks @AFH, I see it now. It indeed completely fails and doesn't fetch any data at all. Odd. – Tuinslak – 2015-01-26T11:52:02.757

The question IMO doesn't make much sense, because you cannot reliably separate server response and network delay from pure curl/wget time. I've just tested beobank.be/nl/Home.aspx and got 0.3 s for curl and 1.3 s for wget. – Putnik – 2017-12-07T11:08:45.893

Answers


Use tcpdump on UDP port 53 to examine how DNS resolution behaves while you fetch the site with curl, and then with wget, in a second terminal.
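
A minimal capture along those lines, assuming a root shell as in the question, would be:

# tcpdump -n -i any udp port 53

Leave this running and repeat the curl and wget commands in another terminal; -n stops tcpdump doing its own reverse lookups, and "udp port 53" restricts the capture to DNS, so you can compare the A/AAAA queries each tool sends and whether the answers come back promptly.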

The usual cause is an IPv4/IPv6 conflict during DNS resolution, which can be fixed by disabling IPv6 via sysctl or by adding the single-request-reopen option to /etc/resolv.conf.
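
Concretely, the two workarounds look roughly like this (a sketch; the sysctl change shown here does not persist across reboots):

# sysctl -w net.ipv6.conf.all.disable_ipv6=1

or, in /etc/resolv.conf:

options single-request-reopen

The single-request-reopen option tells the glibc resolver to close and reopen its socket for the second query when the parallel A and AAAA lookups are not answered correctly, which works around resolvers and firewalls that drop one of the two replies.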

sebthesaviour

Posted 2015-01-26T11:09:55.173

Reputation: 11