Why does wget give an error when executed with sudo, but works fine without?

I tried the following command:

$ wget -q --tries=10 --timeout=20 --spider http://google.com

(From this SO post. I want to check my internet connection in bash.)
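For reference, the check I'm building looks like this (a sketch; the host and flag values are the ones from the linked post, not a recommendation):

```shell
#!/bin/sh
# Sketch of a connectivity check built around the command above.
# wget exits 0 when the spider request succeeds and non-zero otherwise,
# so its exit status can drive the script directly.
check_connection() {
    wget -q --tries=10 --timeout=20 --spider http://google.com
}

# Example usage:
#   if check_connection; then echo "online"; else echo "offline"; fi
```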

I get the following output:

Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:50--  http://google.com/
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting response... 302 Found
Location: http://www.google.de/?gfe_rd=cr&ei=k_IIVreaN-yH8Qfe1Yu4CA [following]
Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:50--  http://www.google.de/?gfe_rd=cr&ei=k_IIVreaN-yH8Qfe1Yu4CA
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.

That looks fine. However, when I run the same command with sudo, I get this:

Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:27--  http://google.com/
Resolving google.com (google.com)... failed: Name or service not known.
wget: unable to resolve host address ‘google.com’

I need this command in a script that I invoke with sudo, so the check always fails.

Can somebody tell me the reason for this? How can I work around that?

h0ch5tr4355

Posted 2015-09-28T08:04:07.677

Reputation: 883

Sorry, I actually wanted to create the question on Ask Ubuntu. Not sure if it's on-topic here... – h0ch5tr4355 – 2015-09-28T08:09:56.693

It's on-topic here. – Deltik – 2015-09-28T08:33:32.923

It would be off-topic there. – SnakeDoc – 2015-09-28T20:04:51.900

This looks like an XY problem. Executing a random wget against google.com doesn't seem to be a good way of checking that an internet connection is working: for example, you might be on a connection that allows HTTP connections to Google but forbids the things your script really wants to do; or Google might forbid wget access to their site. What is the actual problem you have, for which you think that sudo wget blah is a potential solution?

– David Richerby – 2015-10-18T12:32:19.827

Answers

You have a proxy defined in your environment. Yours appears to be 127.0.0.1:3128.

When you run the command with sudo, the proxy environment variables aren't passed along, so wget tries to reach google.com directly and fails to resolve the host name.

You can see what proxy/proxies you have defined in your environment variables with this command:

env | grep proxy
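You can also confirm that sudo is dropping these variables by comparing the two environments (output varies by system; -n makes sudo fail instead of prompting for a password):

```shell
# Proxy variables in the normal shell environment:
env | grep -i proxy || echo "no proxy variables set"
# The same check under sudo typically prints nothing, because sudo
# resets the environment by default (env_reset in /etc/sudoers):
sudo -n env | grep -i proxy || true
```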

Additional information on Ask Ubuntu

Note: If you want sudo to pass the HTTP proxy environment variable, try this:

sudo http_proxy="$http_proxy" wget -q --tries=10 --timeout=20 --spider http://google.com

You can also pass all environment variables using sudo -E:

sudo -E wget -q --tries=10 --timeout=20 --spider http://google.com

Stack Overflow has other options for keeping the environment variable when sudoing.
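If the script always needs the proxy under sudo, another option is to whitelist the variables in the sudoers policy (a sketch; edit only with visudo, and adjust the variable list to your setup):

```
# /etc/sudoers — keep proxy variables despite env_reset
Defaults env_keep += "http_proxy https_proxy ftp_proxy no_proxy"
```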

Deltik

Posted 2015-09-28T08:04:07.677

Reputation: 16 807

OK, thank you very much for not only posting the answer, but also posting the links for explanation. Worked for me perfectly. – h0ch5tr4355 – 2015-09-28T08:57:11.207

You can also use sudo -E to preserve environment variables – Squidly – 2015-09-28T10:45:07.303

For only passing http_proxy, wouldn't sudo http_proxy=$http_proxy wget ... be better? If you use that in a script, you don't have to change it when the proxy changes. – Josef says Reinstate Monica – 2015-09-28T12:10:34.270

Nice one, @Josef. I've updated the answer with your suggestion. – Deltik – 2015-09-28T12:13:00.927

You can also add Defaults env_keep += "http_proxy ftp_proxy" to /etc/sudoers to make sudo automatically preserve those env variables. – Francois – 2015-09-29T12:12:14.890