I have 700K+ .jpg files on my Linux server in a two-level structure: 6000+ directories at level 1, with the individual .jpg files distributed among them at level 2. The files take up 16GB according to du -ch | grep total. There is 3.5GB of free space left on the disk.
I'm looking for a sensible way to copy these files to a Windows machine, and then to update the Windows copy at regular intervals with new files from the Linux server.
I've tried FileZilla, but it only managed ~100K files in an hour, with the load average on the Linux server at around 2. That's both too slow and too heavy on server resources. With 10 connections, FileZilla only managed ~150KB/s on a 100Mbps line.
I'm hoping it's possible to use tar on the individual directories in some fashion, to transfer "bigger chunks" while not filling up the server disk..?
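Something along these lines is what I have in mind (a rough sketch only, assuming the Windows machine can run ssh and tar via WSL or Git Bash, and with placeholder paths, user and host):

    # Run on the Windows machine inside WSL (or Git Bash); /data/photos, user,
    # linuxserver and /mnt/c/photos are placeholders for the real names.
    # The server-side tar writes the archive to stdout, ssh carries the stream,
    # and the local tar unpacks it directly into the target folder.
    ssh user@linuxserver 'tar -C /data/photos -cf - .' | tar -xf - -C /mnt/c/photos

    # For the periodic updates, GNU tar's --newer flag (assuming GNU tar on the
    # server) archives only files changed since a given date, e.g. the last sync:
    ssh user@linuxserver 'tar -C /data/photos -cf - --newer=2024-06-01 .' | tar -xf - -C /mnt/c/photos

Streaming over ssh like this would mean no archive ever lands on the nearly full Linux disk, and each run moves one big sequential stream instead of hundreds of thousands of tiny transfers.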
Coming from the other direction, you could share the Linux source folder as an SMB share and then use ROBOCOPY from the Windows machine to keep it mirrored.
Robocopy mirroring: http://improve.dk/simple-file-synchronization-using-robocopy/
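For reference, a mirror run from the Windows side could look roughly like this; the share name, target folder, thread count and log path are all placeholders to adapt:

    REM \\linuxserver\photos is the SMB/Samba share of the source folder, C:\photos the local copy.
    REM /MIR mirrors the tree (deleting local files that were removed on the server), /FFT tolerates
    REM the coarser timestamps reported over SMB so unchanged files are not recopied, and /MT:16
    REM copies 16 files in parallel, which helps a lot with large numbers of small files.
    robocopy \\linuxserver\photos C:\photos /MIR /FFT /R:2 /W:5 /MT:16 /LOG:C:\robocopy-photos.log

Scheduling that command with Windows Task Scheduler takes care of the regular updates.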