There is a website whose index page looks like this:
Index of /2010/fall/lectures/
Name Last Modified Size Type
Parent Directory/ - Directory
0/ 2011-Feb-15 12:35:17 - Directory
1/ 2011-Feb-15 12:12:35 - Directory
10/ 2011-Feb-15 11:42:48 - Directory
2/ 2011-Feb-15 12:12:39 - Directory
3/ 2011-Jun-18 10:48:50 - Directory
4/ 2011-Feb-15 12:12:44 - Directory
5/ 2011-Feb-15 12:12:46 - Directory
6/ 2011-Feb-15 12:12:48 - Directory
7/ 2011-Aug-01 23:07:15 - Directory
8/ 2011-Feb-15 12:12:52 - Directory
9/ 2011-Feb-15 11:42:49 - Directory
In each directory, there are some files.
I know that in Firefox there is "DownloadThemAll! Tools", which can download all files under the current directory. But I don't know how to easily download the files in each subdirectory without manually clicking into each one and then using the previously mentioned tool.
Also, I would only like to download pdf and zip files, not those large mp3 and flv files. "DownloadThemAll! Tools" can achieve this with filters, but I don't know how to apply such filters to the files in each subdirectory.
Thanks and regards!
There is a wget binary too, which I use: http://gnuwin32.sourceforge.net/packages/wget.htm (I just get the binary, put it into system32, and that's it.)
– Apache – 2011-10-23T09:30:07.377

Barend: Thanks! (1) I would like to download only pdf and zip files, not those large mp3 and flv files. In Firefox, "DownloadThemAll! Tools" can achieve this with filters. Can wget do the same? (2) In wget, can I control the maximum depth of directories for recursive downloading? – Tim – 2011-10-23T22:44:30.800
Yes, and yes. Try running wget --help. The number of levels of recursion is controlled with -l, and file filters can be passed using --accept=pdf,zip or --reject=mp3,flv. – Barend – 2011-10-24T07:04:46.970
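
Putting those flags together, a minimal sketch might look like this (the URL is a placeholder for the actual site, and --no-parent, which is not mentioned above, keeps wget from climbing above the starting directory; adjust the depth and extensions to your case):

wget -r -l 2 --no-parent --accept=pdf,zip http://example.com/2010/fall/lectures/

Here -r turns on recursive retrieval, -l 2 limits recursion to two directory levels, and --accept keeps only the pdf and zip files.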