I have a huge data set consisting of several files, some of which exceed 30 GB in size. After compressing the data set, transferring it over the network, and decompressing it again, I suspect something went wrong (that a file might be corrupt). One quick way to confirm this would be to compute checksums of both copies of a file, but the MD5 tool in Windows PowerShell has a 2 GB limit on file size.
Is there an alternative for computing a checksum of a file whose size exceeds 30 GB? (The tool should be available for both Linux and Windows.)
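For context, hashing the file in fixed-size chunks avoids any size limit. Below is a minimal Python sketch (assuming Python is available on both machines; the md5sum helper name and the 1 MiB chunk size are illustrative choices, not an existing tool) that works the same way on Linux and Windows and uses constant memory regardless of file size:

    import hashlib
    import sys

    def md5sum(path, chunk_size=1 << 20):
        """Compute the MD5 digest of a file by reading it in 1 MiB chunks."""
        h = hashlib.md5()
        with open(path, "rb") as f:
            # iter() keeps calling f.read(chunk_size) until it returns b"" at EOF
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # usage: python md5big.py <path-to-file>
        print(md5sum(sys.argv[1]))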
Can you sshfs the Windows machine from the Linux machine, or use the (I assume Samba) network access to execute the md5 command on the file from the Linux machine? – Fiximan – 2015-08-14T12:55:21.390