openssl s_client returns DNS error

2

When trying to connect to a site using openssl, I get the following error.

$> openssl s_client -connect www.citibank.com:443
getaddrinfo: No address associated with hostname
connect:errno=2

But wget and curl work just fine. What is missing here?

Krishter

Posted 2016-06-11T15:03:55.683

Reputation: 123

Please check for the http_proxy and https_proxy environment variables. – Daniel B – 2016-06-11T15:19:15.393

@DanielB The proxies are set up fine, because the wget and curl commands work. Of course I give https://www.citibank.com to wget, whereas I have to say www.citibank.com:443 for s_client. Would that make a difference?

– Krishter – 2016-06-11T15:23:00.940

Answers

2

So I gather you’re using proxy servers. openssl s_client doesn’t use them (the version you have offers no proxy support), so the connection fails.

When you use a proxy, your browser sends the whole URL (well, almost) to the proxy server:

GET http://www.citibank.com/ HTTP/1.1
...

That means your browser doesn’t have to resolve www.citibank.com locally. The proxy will do that.
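For HTTPS the mechanism is slightly different but the effect is the same: curl and wget ask the proxy to open a tunnel to the target host, so the hostname is again resolved by the proxy rather than on your machine. The request they send looks roughly like this (a sketch, not a capture from your setup):

CONNECT www.citibank.com:443 HTTP/1.1
Host: www.citibank.com:443
...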

It appears you’re located in a rather restricted environment. Otherwise, your DNS server would resolve external addresses even if you couldn’t directly connect to them.
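If you do need s_client to go through the proxy, OpenSSL 1.1.0 and later accept a -proxy option, and older builds can be tunnelled through something like ncat. Both lines below are sketches that assume a hypothetical proxy at proxy.example.com:8080; substitute your own proxy address:

openssl s_client -proxy proxy.example.com:8080 -connect www.citibank.com:443
ncat --proxy proxy.example.com:8080 --proxy-type http www.citibank.com 443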

Daniel B

Posted 2016-06-11T15:03:55.683

Reputation: 40 502

Thank you for the answer. I still find it confusing that the same DNS is used by wget, curl and openssl. Then why is only openssl unable to resolve the name? – Krishter – 2016-06-12T06:56:38.860

Like I said: wget and curl aren't talking to the DNS server at all. – Daniel B – 2016-06-12T11:15:55.493