When reading a webpage in Firefox/Chrome, one can press Ctrl+S to download the webpage with its contents (including CSS, JavaScript, images, etc.) for reading offline later. I wonder if there is a way to imitate this from the command line and get the same downloading result.
I know that there are wget and curl. However, when I tried them out, they didn't download the same set of files that Firefox's Ctrl+S does, i.e. the files that are needed to render the page offline later.
Can anyone tell me if there is any command-line tool (maybe a command-line version of Firefox or Chromium?) for this purpose?
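For reference, a minimal sketch of the kind of wget invocation I have been experimenting with (the flags are from the wget manual; the URL is a placeholder):

    # Save one page plus everything needed to render it offline:
    #   --page-requisites   fetch the images, CSS, and JS the page references
    #   --convert-links     rewrite links so the local copy works offline
    #   --adjust-extension  save files with matching .html/.css extensions
    #   --span-hosts        also fetch requisites served from other hosts (CDNs)
    wget --page-requisites --convert-links --adjust-extension --span-hosts \
         'https://example.com/some-article'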
I want to keep all source files that are needed to render the page. And Ctrl+P is not a command-line tool anyway. – 4253wyerg4e – 2018-09-13T00:41:43.000
right, sorry, I missed the point. – glenn jackman – 2018-09-13T00:43:26.007
and Chromium is not a command-line version of Firefox: it is the open-source version of Chrome. – glenn jackman – 2018-09-13T00:44:32.050
I think Chromium does have a CLI, doesn't it? Like headless browsing with the Firefox command. – 4253wyerg4e – 2018-09-13T00:45:58.147
https://www.keycdn.com/blog/headless-browsers/ – glenn jackman – 2018-09-13T01:04:00.310
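For what it's worth, Chromium can be driven headlessly from the command line, though this produces the rendered DOM or a PDF rather than a Ctrl+S-style folder of requisites (a sketch; the binary may be named chromium, chromium-browser, or google-chrome depending on the platform, and the URL is a placeholder):

    # Print the fully rendered DOM (after JavaScript has run) to a file
    chromium --headless --dump-dom 'https://example.com/some-article' > page.html

    # Or render the page to a PDF for offline reading
    chromium --headless --print-to-pdf=page.pdf 'https://example.com/some-article'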
wget should work, it's supposed to be able to crawl & save an entire web*site*. Maybe you just didn't use it correctly? – Xen2050 – 2018-09-13T03:27:09.473
I tried wget with several flags, but either it recursively crawls the whole website or it won't fetch all the resources needed to render the webpage later, the way Ctrl+S from the browser does. Could you please tell me the wget command that you would use? – 4253wyerg4e – 2018-09-13T03:32:03.020