I have a webpage that basically consists of a list of links to other pages. Using wget I would like to download all the pages listed.
Using "wget -r -l1 URL" I basically get what I want.
But how do I do the same if the list is split over several pages (with URLs ending in "?page=3", "?page=4", and so on)?
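A minimal sketch of the loop approach the comment below refers to, assuming the list lives at a placeholder BASE_URL and that the page numbers run consecutively from 1 up to some known bound (both the URL and the bound are assumptions, not from the post):

    # BASE_URL and the upper bound 50 are placeholders; adjust for the real site.
    BASE_URL="http://example.com/list"
    for page in $(seq 1 50); do
        # -r -l1 fetches each list page plus everything it links to directly
        wget -r -l1 "${BASE_URL}?page=${page}"
    done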
Thanks. That could work. In fact I have several such lists, but I could perhaps just pick some very large number and see what happens if the for loop hits a non-existent page. – None – 2013-04-06T14:35:53.913
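If the number of list pages is unknown, one alternative to picking a very large number is to stop at the first missing page. A sketch, under the assumption that the server returns an HTTP error (e.g. 404) for out-of-range pages; some sites instead serve an empty page with status 200, in which case this check will not fire:

    BASE_URL="http://example.com/list"  # placeholder
    page=1
    # --spider probes the URL without downloading; the loop ends at the first failure
    while wget -q --spider "${BASE_URL}?page=${page}"; do
        wget -r -l1 "${BASE_URL}?page=${page}"
        page=$((page + 1))
    done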