I tried the following command:
$ wget -q --tries=10 --timeout=20 --spider http://google.com
(From this SO post. I want to check my internet connection in bash.)
I get the following output:
Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:50-- http://google.com/
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting response... 302 Found
Location: http://www.google.de/?gfe_rd=cr&ei=k_IIVreaN-yH8Qfe1Yu4CA [following]
Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:50-- http://www.google.de/?gfe_rd=cr&ei=k_IIVreaN-yH8Qfe1Yu4CA
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
Seems OK. However, when I run the command with sudo, I get this instead:
Spider mode enabled. Check if remote file exists.
--2015-09-28 09:55:27-- http://google.com/
Resolving google.com (google.com)... failed: Name or service not known.
wget: unable to resolve host address ‘google.com’
I need this line in a script which I call with sudo, so it always fails.
Can somebody tell me the reason for this? How can I work around that?
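For reference, this is roughly how the line is meant to be used in the script (a minimal sketch; the function name check_internet is just illustrative):

#!/bin/bash
# wget in spider mode exits with status 0 if the page is reachable,
# non-zero otherwise, so the exit status can drive the check.
check_internet() {
    wget -q --tries=10 --timeout=20 --spider http://google.com
}

if check_internet; then
    echo "online"
else
    echo "offline"
    exit 1
fi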
Sorry, I actually wanted to create the question on Ask Ubuntu. Not sure if it's on-topic here... – h0ch5tr4355 – 2015-09-28T08:09:56.693
It's on-topic here. – Deltik – 2015-09-28T08:33:32.923
It would be off-topic there. – SnakeDoc – 2015-09-28T20:04:51.900
This looks like an XY problem. Executing a random wget against google.com doesn't seem to be a good way of checking that an internet connection is working: for example, you might be on a connection that allows HTTP connections to Google but forbids the things your script really wants to do; or Google might forbid wget access to their site. What is the actual problem you have, for which you think that sudo wget blah is a potential solution? – David Richerby – 2015-10-18T12:32:19.827