With

wget --recursive --no-clobber --page-requisites --html-extension \
--convert-links --restrict-file-names=windows \
--domains example.com --no-parent example.com

I can dump a whole website for offline viewing.
The website I want to dump requires me to log in. I could write a Bash script that saves the cookie and then loads it again on each run, but that is time-consuming.
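For reference, a minimal sketch of that cookie-based approach, assuming the site uses a standard form login at /login with username/password fields (the endpoint and field names here are hypothetical; adjust them for the real site):

```shell
# Log in once and save the session cookies (including session-only
# cookies) to cookies.txt. The URL and form field names are placeholders.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'username=me&password=secret' \
     --delete-after https://example.com/login

# Reuse the saved cookies for the recursive mirror.
wget --load-cookies cookies.txt \
     --recursive --no-clobber --page-requisites --html-extension \
     --convert-links --restrict-file-names=windows \
     --domains example.com --no-parent https://example.com/
```

Alternatively, if the cookie is extracted from a browser instead, --load-cookies expects it in Netscape cookies.txt format, so a browser export extension may be needed.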
I suppose ScrapBook for Firefox would be able to do it, but it no longer runs on a modern Firefox.
Question
Is there an easy way to dump a whole website that requires login?
1Extracting the cookie once from a browser, then adding it to all request should work. – gronostaj – 2017-09-04T14:20:19.670