How to wget a list of pages linked to from a list that is spread over many pages


I have a webpage that basically consists of a list of links to other pages. Using wget, I would like to download all of the linked pages.

Using "wget -r -l1 URL" I basically get what I want.

But how do I do the same if the list is split across several pages (with URLs ending in "?page=3", "?page=4", and so on)?

user1583209

Posted 2013-04-06T13:20:58.833


Answers


If you know the number of pages, you could use a for-loop:

for i in {1..5}; do wget -r -l1 "URL?page=$i"; done
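
If the number of pages is not known in advance, one option is to probe each page before fetching it. A minimal sketch, assuming the server answers out-of-range page numbers with an HTTP error (e.g. 404) so that the --spider check fails there:

i=1
while wget -q --spider "URL?page=$i"; do    # --spider only checks that the page exists, nothing is saved
    wget -r -l1 "URL?page=$i"               # page exists, so fetch it and the pages it links to
    i=$((i+1))
done

Probing with --spider first also keeps a single broken link inside a page from ending the loop early, which could happen if the recursive wget's exit status were tested directly.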

etagenklo

Posted 2013-04-06T13:20:58.833

Reputation: 391

Thanks. That could work. In fact I have several such lists, but I could perhaps just pick some very large number and see what happens when the for loop hits a non-existent page. – None – 2013-04-06T14:35:53.913