How can I fetch the images of a public website using wget?

4

2

How can I fetch the images of a public website using wget? It's not a huge site: it has about 40 pages, with around three images per page.

I want to avoid:

  1. Manually downloading the images
  2. Downloading the whole site

Any ideas?

Juan Charrasqueado

Posted 2009-09-11T01:30:44.900

Reputation:

I really want to do this to Stack Overflow and the SO Trilogy for their sick layout images, at sstatic.net. – Maxim Zaslavsky – 2009-12-09T02:41:34.490

Answers

7

You can use wget and tell it to download only image files, or use http://www.downthemall.net/

From http://www.delorie.com/gnu/docs/wget/wget_31.html

You want to download all the GIFs from a directory on an HTTP server. You tried `wget http://www.server.com/dir/*.gif', but that didn't work because HTTP retrieval does not support globbing. In that case, use:

wget -r -l1 --no-parent -A.gif http://www.server.com/dir/

More verbose, but the effect is the same. `-r -l1' means to retrieve recursively (see section 3. Recursive Retrieval), with a maximum depth of 1. `--no-parent' means that references to the parent directory are ignored (see section 4.3 Directory-Based Limits), and `-A.gif' means to download only the GIF files. `-A "*.gif"' would have worked too.
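
For the original question (roughly 40 HTML pages with a few images each, likely JPEG and PNG as well as GIF), a sketch along these lines may work; example.com, the depth, and the list of extensions are assumptions, not part of the quoted manual:

wget -r -l 2 -nd -p --no-parent -A jpg,jpeg,png,gif http://www.example.com/

Here `-A' takes a comma-separated list of suffixes, `-nd' drops the server's directory structure so all the images land in the current directory, and `-p' (--page-requisites) fetches the images each page references even when the recursion depth alone would miss them. If the images are hosted on a different domain than the pages, you would also need `-H' and `-D' to let wget span hosts.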

user10547

Posted 2009-09-11T01:30:44.900

Reputation: 1 089

That's the spirit!! Why can't I upvote you?... – None – 2009-09-11T02:00:59.467

you'll need 15 karma points to cast a vote ... but i can :) +1 – None – 2009-09-11T02:03:57.577

1

I wrote a utility named ifetch; you can try it. http://sourceforge.net/p/ifetch/wiki/Home/

user101906

Posted 2009-09-11T01:30:44.900

Reputation: 11


0

See this.

The second template is all the pictures of the website. I guess that's what you wanted?

Lazer

Posted 2009-09-11T01:30:44.900

Reputation: 13 841