I know you can download webpages recursively using wget, but is it possible to do a dry run? That is, a test run that shows how much would be downloaded if you actually ran the command? I'm thinking of pages with a lot of links to media files, for example images, audio, or video files.
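For context, one commonly suggested approach is wget's `--spider` mode, which visits pages without saving them; combined with `-r` it traverses a site and logs each URL it would retrieve. A minimal sketch (the URL is a placeholder, and note this doesn't total up file sizes for you, though the log shows what would be fetched):

```
# Dry-run sketch: --spider visits links without saving files to disk;
# -r recurses into links, -l 2 limits recursion depth,
# -np keeps wget from climbing to the parent directory.
# example.com is a placeholder URL.
wget --spider -r -l 2 -np http://example.com/
```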
Don't forget the `-nd` (don't create directories) flag if you use it with the `-r` (recursive) flag. – Skippy le Grand Gourou – 2014-01-10T19:29:33.597
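For illustration, a sketch combining the flags from that comment (URL is a placeholder):

```
# -r recurses into linked pages; -nd writes every downloaded file into
# the current directory instead of recreating the site's directory tree.
wget -r -nd http://example.com/
```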