I back up my files to an external HDD by packing them into large archive files.
I have thousands of tiny files, which I bundle into archives of 500 MB to 4.2 GB before sending them to the external HDD. But does a single disk failure destroy the whole archive, or only one file inside it? I fear that one flipped bit could render large parts of the archive useless.
Things like CRC checks can alert you to the existence of corruption, but I am more interested in the ability to recover the undamaged files from a corrupted archive. What archive file formats would provide the best ability to recover from such failures, either through the native design of the archive structure or the existence of supplementary recovery tools? Is there any difference in this capability between zip and iso files?
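As a concrete illustration of the zip case, the sketch below (using Python's standard `zipfile` module; the member names and payloads are made up for the demo) builds a small archive, flips one byte inside one member's data, and shows that the undamaged member still extracts while the corrupted one fails its CRC check. This only demonstrates detection and partial extraction for zip's per-member layout; it says nothing about formats with solid compression, where one flipped bit can affect everything after it.

```python
import io
import zipfile

# Build an in-memory zip with two stored (uncompressed) members.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as zf:
    zf.writestr("good.txt", "intact data")
    zf.writestr("bad.txt", "AAAA-payload-AAAA")

# Simulate a disk error: flip one byte inside bad.txt's stored data.
raw = bytearray(buf.getvalue())
i = raw.index(b"AAAA-payload")
raw[i + 5] ^= 0xFF

with zipfile.ZipFile(io.BytesIO(bytes(raw))) as zf:
    # The unaffected member extracts cleanly...
    good = zf.read("good.txt")
    # ...while the damaged member fails its per-member CRC-32 check.
    try:
        zf.read("bad.txt")
        corrupt_detected = False
    except zipfile.BadZipFile:
        corrupt_detected = True

print(good, corrupt_detected)
```

Because zip stores each member independently with its own CRC-32, damage is localized: you lose the corrupted member, not the archive. Detection is not repair, though; actually recovering the damaged bytes needs redundancy added on top, such as PAR2 parity files or rar's recovery records.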
Please reopen the question. I have reworded it, and it should be clearer now. "Best" will always be somewhat opinion-based, but the requirements for "best" here are quite clear. Little room for personal opinions IMHO. Please delete this comment after reopening. – Marcel – 2015-02-11T13:03:11.643
I know at least one of the programs I use for file synchronization supports multithreaded copying, which I believe mitigates some of the slowness of copying lots of small files; also, though I would have to test to be sure, I have a suspicion that creating an archive of lots of small files would also take longer than creating an archive for several large files, even if no compression is used. I don't remember if this is a Windows-only issue or not, though; iirc, there are some software solutions available for Linux that can handle lots of small files in blocks, but I can't recall the details. – JAB – 2014-03-28T14:01:33.547