I requested a Google Takeout export split into 2GB chunks. Most of the files (22 of 23) are <=2GB as expected, but one is over 30GB. I don't know what's in it or why it's so big, but I expect it to take a while to download.
I am concerned that I may lose my connection, or that my computer might restart, while I am downloading it (to an external hard drive). I am hoping to find a way to download this giant ZIP file that won't force me to start over if something goes wrong. I looked at this question, but the only answer links to some proprietary software I've never heard of, and I don't really want to download anything through it, as I'd imagine the 30GB+ file has sensitive data somewhere.
This answer looks promising, but it comes with disclaimers that it might not always work. That's not a risk I'm willing to take unless it's the only option, because if it fails I would need to repeat the entire download.
Another answer suggested using wget, which is probably the most promising. I'm using Windows, but wget does have a Windows version. However, I would imagine Google requires fairly strong authentication for Takeout downloads, which I don't think I could pass through on the command line. Additionally, I don't know whether there would be any issues downloading to an external (encrypted) drive if the computer loses power or the drive gets disconnected.
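For reference, here is a minimal sketch of what the wget approach might look like. It assumes wget for Windows is installed, that a cookies.txt file has been exported from a logged-in browser session (e.g., with a cookies.txt export extension), and that the URL below is a placeholder for the real download link copied from the Takeout page; none of this has been verified against Takeout's actual authentication flow.

    # Sketch only: --continue resumes a partial download in place, and
    # --tries=0 retries indefinitely after connection drops.
    # cookies.txt and the URL are assumptions/placeholders, not real values.
    # Note: call wget.exe explicitly; in PowerShell, a bare "wget"
    # is an alias for Invoke-WebRequest.
    wget.exe --continue --load-cookies cookies.txt --tries=0 `
        --output-document "E:\takeout\takeout-030.zip" `
        "https://takeout.google.com/..."

If the cookie approach works, rerunning the same command after a power loss or disconnect should pick up from wherever the partial file on the external drive left off.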
Does anyone know of an alternative way to do this that is more reliable for my situation and doesn't rely on random third-party software?
PowerShell and wget are perfectly capable of doing this. – Ramhound – 2018-03-23T00:19:44.237
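On the PowerShell side of the comment above, the built-in BITS cmdlets are one possible approach, since BITS transfers persist across reboots and network drops. A minimal sketch follows; the URL and destination path are placeholders, and whether BITS can follow Google's authenticated Takeout links (it sends no browser cookies by default) is an untested assumption.

    # Sketch only: BITS jobs survive reboots and resume automatically.
    # $url and $dest are hypothetical placeholders.
    $url  = "https://takeout.google.com/..."
    $dest = "E:\takeout\takeout-030.zip"

    $job = Start-BitsTransfer -Source $url -Destination $dest -Asynchronous

    # Poll until the transfer finishes, then write the file to disk.
    while (($job.JobState -eq "Transferring") -or ($job.JobState -eq "Connecting")) {
        Start-Sleep -Seconds 30
    }
    Complete-BitsTransfer -BitsJob $job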