I am copying a website with wget -r and canceled it with ^C. If I resume, it crawls the entire site over again, but does not copy anything because of the existing directories. I would like to resume the wget -r copy and skip all of the already downloaded files.
You can pass the --continue
flag to Wget to ask it to resume the existing download.
However, even in this scenario, Wget will still parse the pages and send a HEAD request for each file. It is not possible to avoid this, for reasons I outlined in this post on StackOverflow.
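For example, a resumed recursive download might look like this (the URL is a placeholder; --no-parent is optional and just keeps the crawl from wandering above the starting directory):

```
# Resume the interrupted recursive copy. Files that are already
# complete on disk are not re-downloaded, although wget still
# re-crawls the pages and sends a HEAD request per file.
wget --continue --recursive --no-parent https://example.com/site/
```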
From man wget: "-c, --continue: resume getting a partially-downloaded file."
Or am I wrong and it already detects this? – Xander Everest – 2018-11-20T17:24:31.147