4
2
How can I fetch the images of a public website using wget? It's not a huge site: it has about 40 pages, with around three images per page.
I want to avoid:
- Manually downloading the images
- Downloading the whole site
Any ideas?
7
You can use wget and tell it to download only image files, or use http://www.downthemall.net/
From http://www.delorie.com/gnu/docs/wget/wget_31.html
You want to download all the GIFs from a directory on an HTTP server. You tried `wget http://www.server.com/dir/*.gif`, but that didn't work because HTTP retrieval does not support globbing. In that case, use:
wget -r -l1 --no-parent -A.gif http://www.server.com/dir/
More verbose, but the effect is the same. `-r -l1` means to retrieve recursively (see section 3, Recursive Retrieval), with a maximum depth of 1.
`--no-parent` means that references to the parent directory are ignored (see section 4.3, Directory-Based Limits), and `-A.gif` means to download only the GIF files.
`-A "*.gif"` would have worked too.
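Since the asker's site spans about 40 pages rather than one directory, the same idea can be adapted by recursing a little deeper and accepting several image extensions. This is a hypothetical sketch (the URL is a placeholder, and `-l2` assumes the pages are all linked from the start page); the command is echoed as a dry run so you can inspect it before removing the `echo`:

```shell
# Placeholder URL -- replace with the real site.
SITE="http://www.example.com/"

# -r -l2      : recurse up to two levels (start page -> its ~40 pages)
# -nd         : save everything into the current directory, no site hierarchy
# --no-parent : never ascend above the start URL
# -A ...      : accept only common image extensions
CMD="wget -r -l2 -nd --no-parent -A jpg,jpeg,png,gif $SITE"

# Dry run: print the command instead of executing it.
# Replace 'echo' with 'eval' (or just run the line directly) to download.
echo "$CMD"
```

Note that the HTML pages themselves are still fetched (wget needs them to find the links) but are deleted afterwards because they don't match the accept list; only the images remain on disk.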
1 That's the spirit!! Why can't I upvote you?... – None – 2009-09-11T02:00:59.467
You'll need 15 karma points to cast a vote... but I can :) +1 – None – 2009-09-11T02:03:57.577
1
I write a util named ifetch, you can try it. http://sourceforge.net/p/ifetch/wiki/Home/
0
Online Images Batch Download 1.1.0 (Firefox Addon)
0
See this.
The second template downloads all the pictures of the website. I guess that's what you wanted?
I really want to do this to Stack Overflow and the SO Trilogy for their sick layout images, at sstatic.net. – Maxim Zaslavsky – 2009-12-09T02:41:34.490