Download a file from a website in command line in Linux

I need to download a file from a website which has a URL formatted like:

http://www.server.com/downloads/1234/

This redirects to a .zip file which has to be saved. There is also a need to authenticate based on username and password.

I tried to use wget, curl and lynx with no luck.

UPDATE:

  • wget doesn't follow the redirect to the file. It simply downloads the webpage instead of the zip file.
  • curl gives the error "Maximum redirection exceeded > 50".
  • lynx gives the same error.

Vicky

Posted 2011-08-09T05:16:04.907

Reputation: 153

Are you getting errors or is it simply not working? – None – 2011-08-09T05:18:56.707

How did you invoke lynx/curl/wget? What error did you get? – Noufal Ibrahim – 2011-08-09T05:19:28.260

I have updated the question. – Vicky – 2011-08-09T05:44:32.637

If you have to log in and the site then redirects, you'll have to authenticate first (submit your user/pass, probably via a POST), then use the cookie that is returned in the next request. The curl error you're getting is probably caused by a missing parameter or cookie in the redirection (e.g. the site sets a session cookie and you're not passing it on, so it redirects you away, then back again... an infinite loop, perhaps?). – None – 2011-08-09T05:49:31.197
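The cookie flow described in this comment could be sketched with curl as below. The login URL and the field names "username"/"password" are assumptions; check the site's actual form.

```shell
# Hypothetical two-step flow: POST the login form and save the session
# cookie, then request the download URL with that cookie jar.
# (Multiple -d options are joined with "&" into one POST body.)
curl -c cookies.txt \
     -d 'username=USER' -d 'password=PASS' \
     http://www.server.com/login/

# Re-use the saved cookie so the redirect chain doesn't loop,
# and cap the number of redirects followed.
curl -b cookies.txt -L --max-redirs 10 -o file.zip \
     http://www.server.com/downloads/1234/
```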

Answers

Wget does support redirects; does this work?

wget --user=USER --password=PASS --max-redirect=20 URL

Note that --max-redirect takes a number. If the site does not use HTTP authentication but instead requires a form submission via GET or POST, then you'll need to do some more work (given the little information you've provided about the site, a good answer is hard to give):

wget --post-file=datafile --keep-session-cookies --max-redirect=20 URL

and your datafile could look like

username=USER&password=PASS

where username and password should be the names of the form fields being submitted. Note that wget sends the file contents verbatim as the POST body, so the fields must go on a single line joined by "&".
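If the login form lives on a separate page from the download, a two-request variant might look like the sketch below. The login URL, field names, and file names are all assumptions.

```shell
# Build the POST body; wget sends the file verbatim, so the fields
# go on one line joined by "&". Field names are hypothetical.
printf 'username=USER&password=PASS' > datafile

# Step 1: log in and save the session cookie (login URL is hypothetical).
wget --post-file=datafile --keep-session-cookies \
     --save-cookies=cookies.txt -O /dev/null \
     http://www.server.com/login/

# Step 2: download with the saved cookie, allowing redirects.
wget --load-cookies=cookies.txt --max-redirect=20 \
     -O file.zip http://www.server.com/downloads/1234/
```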

NOTE: the site needs to redirect to the actual zip file and not to some other page containing a link to the zip file. If the latter is the case, you'll need to write a parsing script, because neither curl nor wget will help you there. They are tools for getting content from a given URL; they are not meant to "guess" what you want them to fetch, they simply download what the server sends.
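If it does turn out to be a link-on-a-page situation, a rough extraction pipeline could look like the following. The URL, cookie file, and page markup are assumptions; a real page may need a more robust HTML parser.

```shell
# Fetch the page (with the session cookie), pull out the first .zip href,
# and download it.
page=$(curl -s -b cookies.txt http://www.server.com/downloads/1234/)
zipurl=$(printf '%s\n' "$page" \
  | grep -o 'href="[^"]*\.zip"' \
  | head -n 1 \
  | sed 's/^href="//; s/"$//')
curl -b cookies.txt -o file.zip "$zipurl"
```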

Yanick Rochon

Posted 2011-08-09T05:16:04.907

Reputation: 932