Download all directories, files, and sub-directories from an HTTP server

0

I want to download all files and directories (recursively, including sub-directories) from a remote HTTP server. I found some solutions for FTP servers, but they don't work for HTTP. So far I've had no luck with wget -r or -m: it downloads all the directories in the root and their respective index.html files, but not the files and sub-directories under them (note that a sub-directory may contain another directory, and so on).

I'm not sure about the tags; please fix them for me if needed. Note: I'm not a native English speaker, sorry for my bad English.

Jack

Posted 2012-12-16T22:09:03.603

Reputation: 963

1

Using wget to recursively fetch a directory with arbitrary files in it - http://stackoverflow.com/questions/273743/using-wget-to-recursively-fetch-a-directory-with-arbitrary-files-in-it

– Logman – 2012-12-16T22:30:28.820

or you might not have access to the file/dirs – Logman – 2012-12-16T22:33:05.080

It's working fine so far; I'm downloading the directories and their contents. I'd suggest you port this comment to an answer, and I'll give +1 and mark it as accepted if those wget options do what I'm looking for. Thanks very much! – Jack – 2012-12-18T16:03:13.727

You can just upvote my comment if you want, and give the guy in the link an upvote too... thanks though – Logman – 2012-12-19T00:52:41.190

@Logman: Done! :) – Jack – 2012-12-19T01:08:55.710

Answers

3

Web servers don't give you a directory listing (not unless you configure them to do so). So, in the general case, this is impossible.

You can probably use rsync+ssh for this. Or FTP.
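If you do have shell access to the server, the rsync-over-ssh approach mentioned above might look like this minimal sketch (the user, host, and paths are placeholders, not from the original thread):

```shell
# Copy a remote directory tree to the local machine over ssh.
# -a preserves permissions/timestamps and recurses, -v is verbose,
# -z compresses data in transit. Trailing slash on the source means
# "the contents of files/", not the directory itself.
rsync -avz -e ssh user@example.com:/var/www/files/ ./files/
```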

Matthias Urlichs

Posted 2012-12-16T22:09:03.603

Reputation: 166

+1 Thanks for your reply. wget is doing the job with the -r --no-parent --reject "index.html*" flags, as suggested by Jeremy Ruten's answer in the Stack Overflow question linked by Logman. I forgot to mention that it's an HTTP server, but it lists files and directories exactly like an FTP server. – Jack – 2012-12-18T16:08:42.827
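For reference, the wget invocation described in the comment above would look roughly like this (the URL is a placeholder; this only works when the server exposes auto-generated directory listings, as noted in the thread):

```shell
# Recursively download a directory tree over HTTP.
# -r           : recurse into links found in the listing pages
# --no-parent  : never ascend above the starting directory
# --reject "index.html*" : discard the auto-generated listing pages
wget -r --no-parent --reject "index.html*" http://example.com/files/
```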