10

When I run curl from the command line:

curl "https://example.com"

it succeeds immediately, returning the result of the request.

When I run the equivalent wget command:

wget https://example.com

it eventually times out with "Unable to establish SSL connection" and no more specific error message. It connects, but cannot complete the SSL handshake. I tried --no-check-certificate, but that made no difference; the failure appears to be timeout related.

However:

wget http://example.com

works fine (plain HTTP instead of HTTPS).

This also affects PHP's file() function.
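
For reference, the PHP failure can be reproduced from the shell with a one-liner (assuming allow_url_fopen is enabled in php.ini):

php -r 'var_dump(file("https://example.com"));'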

My question is: what would cause curl to succeed in retrieving a page (for all sites in our domain) while wget and the PHP interpreter fail? This issue appeared over the weekend; the server was fine before.

(The operating system is Red Hat Enterprise Linux 6.4.)

Resorath

4 Answers

7

This looks like an issue with selecting the SSL/TLS protocol version. For some reason the server is picky about the protocol version; some clients happen to guess correctly, others don't.

With wget, try e.g. --secure-protocol=tlsv1 or --secure-protocol=sslv3. For more details, see the GNU Wget man page.
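
For example, to probe which protocol versions the server accepts (a quick sketch; example.com stands in for your host, and the -ssl3 option may be missing from newer OpenSSL builds):

wget --secure-protocol=tlsv1 https://example.com
wget --secure-protocol=sslv3 https://example.com
# or test the handshake directly:
openssl s_client -connect example.com:443 -tls1
openssl s_client -connect example.com:443 -ssl3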

With PHP, see this question on SO.

tuomassalo
  • I disabled SSLv3 yesterday due to a security issue with SSL, and this morning none of the cron jobs ran (`Unable to establish SSL connection`). The solution was to add `--secure-protocol=tlsv1_2` to the `wget` command. Thanks. – David Bélanger Jul 09 '15 at 14:39
4

On Red Hat Enterprise Linux 6.x and earlier major versions, wget does not support Server Name Indication (SNI), which is required to correctly access a growing number of TLS/SSL-secured web sites, quite possibly including yours. On the other hand, curl in RHEL 6 does support SNI.
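
A quick way to check whether a site depends on SNI (a diagnostic sketch; substitute your own hostname for example.com):

openssl s_client -connect example.com:443 -servername example.com
# compare with a handshake that sends no SNI:
openssl s_client -connect example.com:443

If the second handshake fails or returns a different (default) certificate, the site requires SNI, and the RHEL 6.x wget will not be able to reach it over HTTPS.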

Red Hat is aware of this issue and released a fix in RHEL 6.6. The issue is also fixed in RHEL 7.0.
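
If you can move to 6.6 or later, updating the stock package should be enough (a sketch, using the standard package tools):

rpm -q wget
yum update wget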

Michael Hampton
0

On its surface, this certainly does not make a lot of sense. The first step is to confirm that wget works for SSL at all: can you hit other SSL sites with wget?
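
For example (the URLs here are arbitrary; any known-good HTTPS sites will do):

for url in https://www.google.com https://www.wikipedia.org; do
    wget -q -O /dev/null "$url" && echo "$url OK" || echo "$url FAILED"
done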

If not: chase it as a wget issue.

(Note: if you try four different SSL sites with wget and two of them break, it is still a wget issue.)

Another diagnostic: what does a browser (Chrome and Firefox) do with the https URL? Do they give warnings, or do they connect cleanly?

-1

The latest wget (1.15) on CentOS 6.5

I had to compile wget 1.15 and then replace yum's wget:

wget http://ftp.gnu.org/gnu/wget/wget-1.15.tar.gz
tar -zxvf wget-1.15.tar.gz
cd wget-1.15
# --prefix=/usr/local installs the binary to /usr/local/bin/wget
./configure --prefix=/usr/local --with-ssl=openssl
make
make install
# replace the distribution wget, then confirm the version
cp /usr/local/bin/wget /usr/bin/wget
wget --version