
Recently I upgraded to Debian 9 (9.4, from Debian 8.x) and a script involving curl stopped working. I connect to the internet through a Squid proxy on localhost, which is chained to a parent proxy.

My proxy-related environment variables are configured like this:

root@server:~# printenv | grep -i proxy
HTTP_PROXY=http://127.0.0.1:3128
FTP_PROXY=http://127.0.0.1:3128
https_proxy=https://127.0.0.1:3128
http_proxy=http://127.0.0.1:3128
HTTPS_PROXY=https://127.0.0.1:3128
ftp_proxy=http://127.0.0.1:3128
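
(Setting them amounts to something like the snippet below; the exact file they are exported from is not important, this is just a sketch that mirrors the printenv output above.)

# hypothetical profile snippet; the values mirror the printenv output above
export http_proxy=http://127.0.0.1:3128
export HTTP_PROXY=$http_proxy
export ftp_proxy=http://127.0.0.1:3128
export FTP_PROXY=$ftp_proxy
export https_proxy=https://127.0.0.1:3128
export HTTPS_PROXY=$https_proxy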

When I use wget, it works:

root@server:~# wget https://www.google.com.cu
--2018-03-14 09:08:53--  https://www.google.com.cu/
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘index.html’

index.html                  [ <=>                          ]  11.12K  --.-KB/s    in 0.001s

2018-03-14 09:08:54 (14.9 MB/s) - ‘index.html’ saved [11389]

When I use curl, this is what I get:

root@server:~# curl -v https://www.google.com.cu
* Rebuilt URL to: https://www.google.com.cu/
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to (nil) (127.0.0.1) port 3128 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection:     ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
* successfully set certificate verify locations:
*   CAfile: none
  CApath: /etc/ssl/certs
* TLSv1.2 (OUT), TLS header, Certificate Status (22):
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
* Curl_http_done: called premature == 0
* Closing connection 0
curl: (35) error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol

I know these two commands are not equivalent; this is just to illustrate the HTTPS transfer problem.

I need to use curl because the script talks to a web API: it has to send POST requests rather than GET, and set some headers and a request body (api.dropboxapi.com is the target site).
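
For illustration only, the call is roughly of this shape; the endpoint, token and JSON body below are placeholders, not the actual script contents:

# placeholders: <access-token> and the JSON body are not from the real script
curl -X POST https://api.dropboxapi.com/2/files/list_folder \
     -H "Authorization: Bearer <access-token>" \
     -H "Content-Type: application/json" \
     -d '{"path": ""}'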

This all used to work on Debian 8 without a hitch, and besides, wget WORKS; only curl started failing with the Debian version change. All the other HTTPS clients seem unaffected (Firefox, Chrome, Edge and wget all work as always).

Does anyone know about this problem? Is there any workaround, fix, command-line option or other change that would make Debian 9's version of curl work?

Output of "curl -V"

root@server:~# curl -V
curl 7.52.1 (x86_64-pc-linux-gnu) libcurl/7.52.1 OpenSSL/1.0.2l zlib/1.2.8 libidn2/0.16 libpsl/0.17.0 (+libidn2/0.16) libssh2/1.7.0 nghttp2/1.18.1 librtmp/2.3
Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
Features: AsynchDNS IDN IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB SSL libz TLS-SRP HTTP2 UnixSockets HTTPS-proxy PSL
  • Your proxy server doesn't accept SSL connections from clients, so you shouldn't refer to it with `https`. – Michael Hampton Mar 14 '18 at 15:53
  • See the wget command above, it works. So I don't understand what you are trying to say. :-( – Yanko Hernández Álvarez Mar 14 '18 at 16:14
  • Besides, it used to work on Debian 8. – Yanko Hernández Álvarez Mar 14 '18 at 16:15
  • That doesn't mean anything. – Michael Hampton Mar 14 '18 at 16:18
  • I don't understand... Are you saying that curl is trying to connect to the squid proxy server using SSL/TLS directly (and squid, of course, not initiating the SSL handshake, so curl fails) instead of connecting "cleartext" (to squid) and then using a "CONNECT" request to the target HTTPS site? If this is what you are saying... well, that's a MAJOR functionality change from previous versions, bound to break ALL scripts that use curl behind a proxy, and I don't think Debian is so careless as to do it without fair warning (hence "It used to work on Debian 8")... So, would you care to explain? – Yanko Hernández Álvarez Mar 14 '18 at 16:40
  • You have a much higher opinion of Debian than I think they deserve, having been familiar with Debian for its entire existence... But yes, that's exactly what is going on. Your proxy server does not accept SSL connections. How it worked before, I have no idea; I don't see the old configurations or the old software. – Michael Hampton Mar 14 '18 at 16:41
  • Is there any way to do it the old way (connecting "cleartext" (to squid) and then using a "CONNECT" request to the target HTTPS site)? any command line option? – Yanko Hernández Álvarez Mar 14 '18 at 16:46
  • As I said before, use `http` not `https` in the proxy server URL. – Michael Hampton Mar 14 '18 at 16:46
  • ahhh you are saying I must change https_proxy=https://127.0.0.1:3128 (and company ;-) to https_proxy=http://127.0.0.1:3128... Let me try that. Will get back to you as soon as I try. – Yanko Hernández Álvarez Mar 14 '18 at 16:49
  • Yes, that's what I mean, sorry if my previous comments were confusing. – Michael Hampton Mar 14 '18 at 16:49
  • Let me tell you something: you are the man!!!! That was it. Now I don't know how it works for wget and previous versions of curl... WOW. Thanks. Please, put the full answer, so I can accept it (the credit is all yours) :-) – Yanko Hernández Álvarez Mar 14 '18 at 16:55

1 Answer


Many, many thanks to Michael Hampton (see the comments). It turns out the problem was in the proxy configuration. It should say:

https_proxy=http://127.0.0.1:3128
HTTPS_PROXY=http://127.0.0.1:3128

So curl was trying to connect to squid itself over TLS and, of course, failing, since squid only accepts plain HTTP connections from its clients. Debian 9's curl (7.52.1) advertises HTTPS-proxy support, so it now honors the https:// scheme in the proxy URL; wget and the Debian 8 curl simply speak plain HTTP to the proxy (tunnelling HTTPS targets with CONNECT), which is why they kept working.
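
A minimal way to apply and verify the change in the current shell (the test URL is the one from the question):

# point both spellings of the variable at the proxy with a plain http:// scheme
export https_proxy=http://127.0.0.1:3128
export HTTPS_PROXY=http://127.0.0.1:3128
# the request should now go through squid via a CONNECT tunnel
curl -v https://www.google.com.cu

The same thing can also be done for a single invocation with curl's -x/--proxy option, which takes precedence over the *_proxy environment variables:

curl -x http://127.0.0.1:3128 https://www.google.com.cu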