Native alternative to wget in Windows PowerShell?

293

93

I know I can download and install the aforementioned program (wget for Windows), but my question is this:

In Windows PowerShell, is there a native alternative to wget?

I need wget simply to retrieve a file from a given URL with HTTP GET. For instance:

wget http://www.google.com/

jsalonen

Posted 2011-11-28T09:56:40.167

Reputation: 7 143

Answers

243

Here's a simple PS 3.0 and later one-liner that works and doesn't involve much PS barf:

wget http://blog.stackexchange.com/ -OutFile out.html

Note that:

  • wget is an alias for Invoke-WebRequest
  • Invoke-WebRequest returns an HtmlWebResponseObject, which contains a lot of useful HTML parsing properties such as Links, Images, Forms, InputFields, etc., but in this case we're just using the raw Content
  • The file contents are stored in memory before writing to disk, making this approach unsuitable for downloading large files
  • On Windows Server Core installations, you'll need to write this as

    wget http://blog.stackexchange.com/ -UseBasicParsing -OutFile out.html
    
  • Prior to Sep 20 2014, I suggested

    (wget http://blog.stackexchange.com/).Content >out.html
    

    as an answer.  However, this doesn't work in all cases, as the > operator (which is an alias for Out-File) converts the input to Unicode.

If you are using Windows 7, you will need to install version 4 or newer of the Windows Management Framework.

You may find that doing a $ProgressPreference = "silentlyContinue" before Invoke-WebRequest will significantly improve download speed with large files; this variable controls whether the progress UI is rendered.
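Putting the notes above together, a minimal sketch (reusing the example URL from this answer):

```powershell
# Suppress the progress UI for the duration of the download;
# rendering it can slow large transfers down considerably
$ProgressPreference = 'SilentlyContinue'
Invoke-WebRequest 'http://blog.stackexchange.com/' -UseBasicParsing -OutFile out.html
$ProgressPreference = 'Continue'   # restore the default
```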

Warren Rumak

Posted 2011-11-28T09:56:40.167

Reputation: 2 637

17Fair warning: This method will put the entire content of the file into memory before writing it out to the file. This is not a good solution for downloading large files. – im_nullable – 2014-07-13T06:35:31.883

2@im_nullable, good call -- I've added that to the post. – Warren Rumak – 2014-09-18T15:47:18.970

@dezza, what do you mean by "encoding"? The output is a capture of the content sent in the body of the HTTP GET response, be it binary files, HTML, or whatever else you ask for. It's really hard to see how a .py file can "break" by copying its raw contents from one place to another, unless the web server is the one messing with it first..... – Warren Rumak – 2014-09-20T18:39:42.887

@Warren Try https://bootstrap.pypa.io/get-pip.py with above command and run it with Python 2.7, then try another tool like wget and it will work fine.

– dezza – 2014-09-20T19:35:04.567

1@dezza I've updated the answer with a different approach. Try it again. – Warren Rumak – 2014-09-20T20:06:57.757

@Warren +1 :) better and shorter. – dezza – 2014-09-20T20:07:56.930

p.s. To save some typing, you can type -o then hit tab to get tab completion for -OutFile – Warren Rumak – 2014-09-20T20:08:53.947

Thanks. Do you by any chance know how to clean copy/clipboard from PS console without block-selecting ? It's irritating copying from PS because almost every long command wraps on the next line. – dezza – 2014-09-20T20:11:29.827

You can redirect the output of powershell to the clipboard by doing something like "Get-PSDrive | clip". Just be aware that it will rewrite the output as Unicode. – Warren Rumak – 2014-09-22T16:39:57.090

Also, the console in the Powershell ISE doesn't share the console's selection logic. I suggest using that instead of the standard PS console. – Warren Rumak – 2014-09-22T16:42:53.450

This doesn't work if you're behind an authenticating firewall. You get an error "Proxy authentication required". You can fix this by running $wc = New-Object Net.WebClient; $wc.UseDefaultCredentials = $true; $wc.Proxy.Credentials = $wc.Credentials. You only need to do this once per session, it seems. (I'm not 100% sure why this works, it looks like the proxy is a session-level shared object...) – Paul Moore – 2014-12-15T15:33:10.257

This uses IE, like everything in Powershell it's been done in a quick and dirty way, instead of just integrating wget or curl. But obviously if Microsoft did that it would ruin their licencing. – Chris S – 2016-04-25T13:08:09.010

@ChrisS It doesn't use IE if you provide the -UseBasicParsing parameter. (That's why this parameter is required for Server Core) – Warren Rumak – 2016-05-12T01:12:26.247

converts the input to Unicode: instead of > out.html you can use | Set-Content out.html -Encoding Byte, while not helpful for iwr since it has -OutFile now, it's good to know when writing binary data to files in other cmdlets – Hashbrown – 2019-09-13T06:37:22.623

4This is now the correct answer, and I ran into wget accidentally testing if I had the actual wget installed. Annoying that it can't get the filename easily (you have to specify it in the output redirection), but this option has a better UI than the real wget (in my opinion) so there's that. – Matthew Scharley – 2014-01-14T00:52:15.130

13

But Windows 7 only comes with PowerShell 2.0, and the result will be "The term 'Invoke-WebRequest' is not recognized as the name of a cmdlet, ...".

– Peter Mortensen – 2014-06-06T17:51:35.800

Powershell 4 is available for Windows 7 -- it's part of the Windows Management Framework. http://www.microsoft.com/en-us/download/details.aspx?id=40855

– Warren Rumak – 2014-06-06T18:26:44.957

186

If you just need to retrieve a file, you can use the DownloadFile method of the WebClient object:

$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)

Where $url is a string representing the file's URL, and $path is a string representing the local path the file will be saved to.

Note that $path must include the file name; it can't just be a directory.
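For example, a sketch that builds an absolute path including the file name (the URL is just the one from the question):

```powershell
$url  = 'http://www.google.com/'                     # example URL from the question
$path = Join-Path (Get-Location).Path 'google.html'  # absolute path, including file name

$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)
```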

Traveling Tech Guy

Posted 2011-11-28T09:56:40.167

Reputation: 8 743

Why does this use 100% of one of my CPUs? – Hut8 – 2015-10-26T19:11:38.380

@jsalonen and since that's .NET, it works on PS 2.0, which I am restricted to at the moment. – Nick – 2016-10-13T17:27:52.893

4For just getting a url and ignoring the results (e.g., part of an IIS warmup script) use DownloadData: (new-object System.Net.WebClient).DownloadData($url) | Out-Null – BurnsBA – 2017-05-02T18:35:05.967

Error messages are very unhelpful; if $path is a directory or existing file, it throws a generic Exception. Ah, Microsoft. – BaseZen – 2018-03-30T14:24:54.993

Nice! Invoke-WebRequest blows up in my packer job, complaining about "Out of Memory". The WebClient.DownloadFile works a treat. – Ian Ellis – 2019-02-09T17:32:03.677

33So far this has been the best solution proposed. Also given that it seems I can rewrite it in one line format as (new-object System.Net.WebClient).DownloadFile($url, $path) it is the best correspondence for wget I have seen so far. Thanks! – jsalonen – 2011-11-28T10:49:59.263

3As a side-note you can also do this asynchronously using something like (new-object System.Net.WebClient).DownloadFileAsync(url,filePath) – James – 2013-04-23T08:49:57.570

Can we fetch a particular text via Webclient and outout to a notepad ? thanks – Mowgli – 2013-06-18T16:11:53.427

6

Yes, this works out of the box on Windows 7 (that comes with PowerShell 2.0). Sample: $client.DownloadFile( "http://blog.stackexchange.com/", "c:/temp2/_Download.html")

– Peter Mortensen – 2014-06-06T17:57:09.023

88

There is Invoke-WebRequest in the upcoming PowerShell version 3:

Invoke-WebRequest http://www.google.com/ -OutFile c:\google.html

user4514

Posted 2011-11-28T09:56:40.167

Reputation: 1 172

1@gWaldo PowerShell can be quite slick.

You can use shortcuts

(iwr http://www.google.com/).Content > google.html

or use arguments

Invoke-WebRequest -Uri "http://www.google.com" -OutFile google.html.

It's important to realize that in PowerShell the pipe is an object pipe, not just a character pipe so the output of Invoke-WebRequest isn't a stream of the file but rather an object where you'll need to use .Content. Try this:

$foo = Invoke-WebRequest http://www.google.com then $foo | Get-Member then $foo.StatusCode or $foo.Content. – Tyler Szabo – 2017-03-08T00:20:41.013

1On Windows 2016 Core / Standard I had to pass -usebasicparsing as otherwise it was complaining about missing internet explorer engine – Adi Roiban – 2017-07-30T04:23:52.423

9all the elegance of dd... – gWaldo – 2012-08-31T15:29:01.757

1@gWaldo you are kidding–this is a joy to use (speaking as someone just learning PS) – None – 2012-10-16T20:41:05.067

8I just mean that the -Outfile parameter seems extraneous when you could just use > (to overwrite) or >> (to append) to a file. – gWaldo – 2012-10-17T13:12:45.520

5@gWaldo or even deduce the filename from the URL just like wget does :) – Peltier – 2013-07-17T10:29:16.077

Invoke-WebRequest $url -OutFile ($url -split "/")[-1]. Unfortunately this fails if the URL ends with a slash. Should be pretty straightforward to improve. – Peltier – 2013-07-22T14:24:37.457

5And as of PS 4.0, wget and curl are aliased to Invoke-WebRequest (iwr) by default :D – Bob – 2014-03-25T16:12:35.013

@Bob Thx, Feel free to edit the answer and include these aliases! – user4514 – 2014-03-26T18:41:06.883

18

It's a bit messy but there is this blog post which gives you instructions for downloading files.

Alternatively (and this is one I'd recommend) you can use BITS:

Import-Module BitsTransfer
Start-BitsTransfer -source "http://urlToDownload"

It will show progress and will download the file to the current directory.
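As a sketch with an explicit destination (the URL and paths here are hypothetical):

```powershell
Import-Module BitsTransfer
# -Destination is optional; without it the file is saved to the current directory
Start-BitsTransfer -Source 'http://example.com/files/archive.zip' `
                   -Destination "$env:TEMP\archive.zip"
```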

Matthew Steeples

Posted 2011-11-28T09:56:40.167

Reputation: 2 130

3BITS relies on support at the server end, if available this works in the background and you can get progress updates with other cmdlets. – Richard – 2011-11-28T10:42:06.267

2

I tried to fetch http://www.google.com/, but all I get is Start-BitsTransfer : Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED)). I'm puzzled :|

– jsalonen – 2011-11-28T10:45:22.593

1@jsalonen I think that BITS will only download files rather than pages. As Richard says it relies on some server side support (although I don't think it's Microsoft specific). – Matthew Steeples – 2011-11-28T11:09:21.580

I see, and I think I get the point of using BITS; however, it's not what I'm looking for here. – jsalonen – 2011-11-28T11:23:45.577

6

PowerShell V4 One-liner:

(iwr http://blog.stackexchange.com/).Content >index.html

or

(iwr http://demo.mediacore.tv/files/31266.mp4).Content >video.mp4

This is basically Warren's (awesome) V3 one-liner (thanks for this!), with just a tiny change to make it work in a V4 PowerShell.

Warren's one-liner - which simply uses wget rather than iwr - should still work for V3 (at least, I assume so; I haven't tested it, though). But when you try to execute it in a V4 PowerShell (as I did), you'll see PowerShell fail to resolve wget as a valid cmdlet/program.

For those interested, that is - as I picked up from Bob's comment on the accepted answer (thanks, man!) - because as of PowerShell V4, wget and curl are aliased to Invoke-WebRequest, set to iwr by default. Thus, wget cannot be resolved (and curl does not work here either).

eyecatchUp

Posted 2011-11-28T09:56:40.167

Reputation: 165

4

Here is a PowerShell function that resolves short URLs before downloading the file

function Get-FileFromUri {  
    param(  
        [parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
        [string]
        [Alias('Uri')]
        $Url,
        [parameter(Mandatory=$false, Position=1)]
        [string]
        [Alias('Folder')]
        $FolderPath
    )
    process {
        try {
            # resolve short URLs
            $req = [System.Net.HttpWebRequest]::Create($Url)
            $req.Method = "HEAD"
            $response = $req.GetResponse()
            $fUri = $response.ResponseUri
            $filename = [System.IO.Path]::GetFileName($fUri.LocalPath);
            $response.Close()
            # download file
            $destination = (Get-Item -Path ".\" -Verbose).FullName
            if ($FolderPath) { $destination = $FolderPath }
            if ($destination.EndsWith('\')) {
                $destination += $filename
            } else {
                $destination += '\' + $filename
            }
            $webclient = New-Object System.Net.webclient
            $webclient.downloadfile($fUri.AbsoluteUri, $destination)
            write-host -ForegroundColor DarkGreen "downloaded '$($fUri.AbsoluteUri)' to '$($destination)'"
        } catch {
            write-host -ForegroundColor DarkRed $_.Exception.Message
        }  
    }  
}  

Use it like this to download the file to the current folder:

Get-FileFromUri http://example.com/url/of/example/file  

Or to download the file to a specified folder:

Get-FileFromUri http://example.com/url/of/example/file  C:\example-folder  

user25986

Posted 2011-11-28T09:56:40.167

Reputation: 141

2

The following function will get a URL.

function Get-URLContent ($url, $path) {
  if (!$path) {
      $path = Join-Path $pwd.Path ([URI]$url).Segments[-1]
  }
  $wc = New-Object Net.WebClient
  $wc.UseDefaultCredentials = $true
  $wc.Proxy.Credentials = $wc.Credentials
  $wc.DownloadFile($url, $path)
}

Some comments:

  1. The last 4 lines are only needed if you are behind an authenticating proxy. For simple use, (New-Object Net.WebClient).DownloadFile($url, $path) works fine.
  2. The path must be absolute, as the download is not done in your current directory, so relative paths will result in the download getting lost somewhere.
  3. The if (!$path) {...} section handles the simple case where you just want to download the file to the current directory using the name given in the URL.
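For instance, both call styles of the function above look like this (the URL and path are hypothetical):

```powershell
# Download to the current directory, using the file name taken from the URL
Get-URLContent 'http://example.com/files/report.pdf'

# Or download to an explicit absolute path, including the file name
Get-URLContent 'http://example.com/files/report.pdf' 'C:\Downloads\report.pdf'
```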

Paul Moore

Posted 2011-11-28T09:56:40.167

Reputation: 573

1

Use the Windows 10 Bash shell, which includes wget once the Windows feature is set up.

How to install Ubuntu bash shell on Windows:

YouTube: Running Bash on Ubuntu on Windows!

Windows Subsystem for Linux Documentation

Miloud Eloumri

Posted 2011-11-28T09:56:40.167

Reputation: 111

1Consider adding some quoted reference to this answer supporting what you state in case the link ever dies so the answer content is still available that is currently only available via that link per your suggestion. – Pimp Juice IT – 2017-09-27T03:36:34.027

0

Invoke-WebRequest's -OutFile parameter expects a string, so if your file name starts with a number and is not enclosed in quotes, no output file is created.

eg. Invoke-WebRequest -Uri "http://www.google.com/" -outfile "2.pdf"

This does not affect filenames starting with a letter.

Zimba

Posted 2011-11-28T09:56:40.167

Reputation: 107

This solution is mentioned in other answers (wget is an alias of Invoke-WebRequest, and one similar to the above) – bertieb – 2018-11-27T18:27:11.227

The point of the answer was to emphasise the note. None of the answers deal with no file being created due to the syntax error. – Zimba – 2018-11-28T10:28:01.493

That should really be a comment on the other answer[s] – bertieb – 2018-11-28T13:02:42.547

This answer is not provided in other answers nor similar to the one above. – Zimba – 2019-03-23T15:19:20.410

0

If your Windows is new enough (version 1809 or newer), there is a "real" curl available. curl has the command-line option "-O" (capital letter O; the lowercase letter won't do the same!). The "-O" option, or its long form "--remote-name", tells curl to save the file under the same name as the file-name part of the URL.

You need to invoke it as "curl.exe" to distinguish it from the alias "curl" for "Invoke-WebRequest". Incidentally, it works in cmd.exe without changes.

Using the same example as in another answer here

curl.exe -O http://demo.mediacore.tv/files/31266.mp4

(The site won't allow me to add this as a comment, since I apparently need more "reputation" for that - so it gets a new answer)

Dweia

Posted 2011-11-28T09:56:40.167

Reputation: 1

-1

This should work for you to get around the no browser initialized stuff. Note the "-UseBasicParsing" param.

Invoke-WebRequest http://localhost -UseBasicParsing

Joe Healy

Posted 2011-11-28T09:56:40.167

Reputation: 99

1(1) What is “the no browser initialized stuff”? (2) Note that the accepted answer already mentions -UseBasicParsing. – Scott – 2019-04-28T06:07:31.707