I want to download everything from a remote HTTP server: all files, directories, and so on. I found some solutions for FTP servers, but they don't work for HTTP. So far I've had no luck with wget -r or wget -m: they download all the directories at the root and their respective index.html, but not the files and sub-directories beneath them (note that a sub-directory may contain another directory, and so on).
I'm not sure about the tags; please fix them for me if needed. Note: I'm not a native English speaker, sorry for the bad English.
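For reference, the invocations I've been trying look roughly like this (the URL is just a placeholder for the real server):

    # recursive download: follows links found in the fetched pages
    wget -r http://example.com/files/

    # mirror mode: currently equivalent to -r -N -l inf --no-remove-listing
    wget -m http://example.com/files/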
Using wget to recursively fetch a directory with arbitrary files in it - http://stackoverflow.com/questions/273743/using-wget-to-recursively-fetch-a-directory-with-arbitrary-files-in-it
– Logman – 2012-12-16T22:30:28.820
Or you might not have access to the files/dirs. – Logman – 2012-12-16T22:33:05.080
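For anyone reading later: the approach in that linked question basically amounts to adding a couple of options to the recursive fetch. The flags and URL below are a sketch of that idea, not a quote from the link:

    # recurse, but never ascend above the starting directory,
    # and don't keep the auto-generated index.html listing pages
    wget -r --no-parent --reject "index.html*" http://example.com/files/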
So far it's working fine: I'm downloading the directories and their contents. I'd suggest you post this comment as an answer; I'll give it +1 and mark it as accepted if those wget options do what I'm looking for. Thanks very much! – Jack – 2012-12-18T16:03:13.727
You can just upvote my comment if you want, and give the guy in the link an upvote too... thanks though. – Logman – 2012-12-19T00:52:41.190
@Logman: Done! :) – Jack – 2012-12-19T01:08:55.710