I have about 200,000 files that I am transferring to a new server today. I haven't done anything on this scale before and wanted some advice on how to go about it. I am moving them between two CentOS 6 servers in different parts of the country. I don't have enough HDD space on the original server to tar all of the directories and files into one massive tarball, so my question is: how should I transfer all of these files? rsync? Some special way of using rsync? Any input/suggestions would be amazing.
Thanks
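As a sketch of the kind of rsync invocation in question (the host and paths below are placeholders, not details from the post):

rsync -avz --partial --progress /data/files/ user@newserver:/data/files/

Here -a preserves permissions, ownership, timestamps, and symlinks; -z compresses in transit; --partial keeps partially transferred files; --progress shows per-file progress. Because rsync only copies what the destination is missing, rerunning the exact same command after a dropped connection resumes the transfer instead of starting over.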
EDIT: For those wondering, I highly suggest using a screen session when running a large rsync command like this, especially since something silly may happen and you could lose the connection to server A, the one you are running the rsync command from. Then you can just detach the screen and resume it later.
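A minimal sketch of the screen workflow described in that edit (the session name "transfer" is arbitrary, and the rsync command reuses the placeholder paths from above):

screen -S transfer
rsync -avz --partial --progress /data/files/ user@newserver:/data/files/

Detach with Ctrl-a d (or simply lose the connection; the session survives either way), then log back in to server A and reattach with screen -r transfer to find rsync still running.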
Have you tried rsync yet? Maybe on a small set of files or so? Should be the ideal tool for that. – slhck – 2013-03-26T18:14:51.173

It's almost certainly not the best tool for this job, but you may be interested in the fact that you can stream tar through an ssh connection rather than having to compress to a file before moving the file:
tar cz . | ssh user@example.com tar xz
– Aesin – 2013-03-27T01:00:36.447

It could be off topic, but (especially for an initial load, and then using rsync for subsequent updates): "Never underestimate the bandwidth of a station wagon full of tapes." That is, have you considered placing a 2nd HDD in the server (or plugging in a USB 2/USB 3 disk), backing up onto it, and sending that disk via FedEx to the remote location? It could be MUCH faster than anything else, and save bandwidth for other uses. – Olivier Dulac – 2013-03-27T09:10:59.983

I don't have any BW limits on one provider, and the other I won't reach this month. So I don't really have an issue wasting it :P – MasterGberry – 2013-03-27T15:54:21.660
@OlivierDulac http://what-if.xkcd.com/31/ – Bob – 2013-03-28T10:07:43.570