Wget HEAD request?

54

9

I'd like to send an HTTP HEAD request using wget. Is that possible?

Xiè Jìléi

Posted 2010-10-07T15:43:02.580

Reputation: 14 766

Answers

59

It's not wget, but you can do that quite easily by using curl.

curl -I http://www.superuser.com/

Produces this output:

HTTP/1.1 301 Moved Permanently                        
Content-Length: 144                       
Content-Type: text/html; charset=UTF-8     
Location: http://superuser.com/
Date: Sat, 09 Oct 2010 19:11:50 GMT
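Since the response above is a 301 redirect, you can also add -L (--location) if you want curl to follow it and show the headers of the final target as well; for example:

curl -I -L http://www.superuser.com/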

SleighBoy

Posted 2010-10-07T15:43:02.580

Reputation: 2 066

-I is equivalent to --head. – Nicolas Marchildon – 2015-12-10T20:48:11.963

This is exactly what I want. – Xiè Jìléi – 2010-10-11T15:23:50.523

If you need HTTPS with a self-signed certificate, you can also add -k or --insecure – Mike Aski – 2019-02-06T17:31:33.377
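A sketch of that combination, with the hostname as a pure placeholder:

curl -I -k https://self-signed.example.com/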

37

Try:

wget -S --spider www.example.com

You can also pass -O /dev/null to prevent wget from writing the HTTP response to a file.
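For example, with -S alone (no --spider) wget performs a full GET, so discarding the body keeps it from being saved to disk; a sketch, with example.com as a placeholder host:

wget -S -O /dev/null http://www.example.com/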

Casual Coder

Posted 2010-10-07T15:43:02.580

Reputation: 3 614

-S will show headers, but it executes a GET, not a HEAD. In other words, it will fetch the entire URL. – Dan Dascalescu – 2014-02-28T23:21:52.510

With wget -S --spider http://localhost, the log entry created on the Apache server is 127.0.0.1 - - [04/Mar/2014:15:36:32 +0100] "HEAD / HTTP/1.1" 200 314 "-" "Wget/1.13.4 (linux-gnu)" – Casual Coder – 2014-03-04T14:38:26.707

20

There isn't any need for curl.

With Wget, adding --spider implies that you want to send a HEAD request (as opposed to GET or POST).

This is a great minimalistic way of checking whether a URL responds. You can, for example, use it in scripted checks, and the HEAD operation ensures you put no real load on either the network or the target web server.

Bonus information: if Wget gets an HTTP error 500 from the server when it performs the HEAD, it then moves on to perform a GET against the same URL. I don't know the reasoning behind this design, but it is why you may see both a HEAD and a GET request being performed against the server. If nothing is wrong, only a HEAD request is performed. You can disable this behaviour with the --tries option, limiting Wget to only one attempt.

All in all, I recommend this for testing whether a URL is responding:

# This works in Bash and derivatives.
# Assumes the URL to check is already in $URL, e.g. URL="http://www.example.com/"
wget_output=$(wget --spider --tries 1 "$URL" 2>&1)
wget_exit_code=$?

if [ "$wget_exit_code" -ne 0 ]; then
    # Something went wrong
    echo "$URL is not responding"
    echo "Output from wget:"
    echo "$wget_output"
else
    echo "Check succeeded: $URL is responding"
fi
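For a quick one-off check from an interactive shell, the same idea collapses to a one-liner along these lines (the URL is just a placeholder):

wget --spider --tries 1 http://www.example.com/ && echo "responding" || echo "not responding"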

thisisfun

Posted 2010-10-07T15:43:02.580

Reputation: 443

4

wget -S gets the file:

Content-Length: 2316
Length: 2316 (2.3K) [text/plain]
Saving to: `index.html'

wget --spider gets just the headers:

Spider mode enabled. Check if remote file exists.
Length: unspecified [text/plain]
Remote file exists.

LanPartacz

Posted 2010-10-07T15:43:02.580

Reputation: 41

-1

Though it's not wget, many Perl installations with the LWP module also have a HEAD command installed.
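If libwww-perl is installed, that command (an alias of lwp-request) can be used directly; for example, with a placeholder URL:

HEAD http://www.example.com/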

Rich Homolka

Posted 2010-10-07T15:43:02.580

Reputation: 27 121