How to send huge files from one server to another

I need to transfer a huge file (more than 70GB) from one server in Canada to another server in Africa.

I tried FTP, but I always get disconnected somewhere. I guess the network is not stable on the African side: part of the file gets uploaded, but when I open it, it is corrupted.

So I really need a tool to transfer the huge file:

  1. Use the full bandwidth, so I can upload it as fast as possible.
  2. Detect an unstable network, drop the damaged data, and upload it again.
  3. Not upload to any other place (most file-sharing solutions upload to their own server and give you a share link).
  4. Ideally, be installable on both servers.

Both servers run Windows Server 2008 R2. I cannot use a third-party transfer channel.

Robby Shaw

Posted 2012-10-01T22:35:54.077

Reputation: 213

I like rsync for this purpose. Give it the --partial flag and it will save any progress if it's disconnected. – David Schwartz – 2012-10-01T23:12:27.800

What OS on both sides? – grs – 2012-10-02T00:13:12.280

The suggested exact duplicate is not one, as it doesn't address the questioner's restriction against using a third-party service to facilitate the file transfer. – herrtodd – 2012-10-02T18:25:32.480

@grs, Windows Server 2008 R2 – Robby Shaw – 2012-10-02T22:53:01.737

After 5 days you want to send another 70 GB. Are these 100% different from the initial 70 GB? How much data changes in those 5 days? If the delta is relatively small, it may be wise to airmail the first 70 GB and then come up with a scheme to transmit just the deltas. – akira – 2012-10-03T04:24:47.200

@akira, thanks for mentioning that, but I am not sure which parts change. The system writes data anywhere in these files. – Robby Shaw – 2012-10-03T05:32:07.957

@RobbyShaw: can you connect to a "network drive" on the remote machine? – akira – 2012-10-03T17:22:47.723

@akira: yes, I can remote control the other server. – Robby Shaw – 2012-10-03T20:59:12.990

There are multiple solutions for transferring only the changed files out of the 70 GB pool of files; that should make future transfers much faster. http://en.wikipedia.org/wiki/Remote_Differential_Compression

– akira – 2012-10-04T06:50:35.760

Answers

12

BitTorrent might be a good solution for you.

Create the torrent on the origin server, transfer the torrent file to the destination, and then use one of the clients as the BitTorrent tracker. BitTorrent breaks the file up into manageable pieces and makes sure they are all transferred without error.

There's obviously some overhead in creating the torrent file, and there's no swarm to boost your download speed, but if you're dealing with a crappy link, it may work very well for you.

herrtodd

Posted 2012-10-01T22:35:54.077

Reputation: 716

That might be a good solution because the torrent client would automatically detect corrupt pieces and dropped connections and just keep on going until it’s done. One side would have to set up a tracker, but µTorrent has a built-in mini-tracker, so that’s not a problem. You can create the torrent file (which is tiny), set it to private, tweak the network settings (you don’t want to limit it unnecessarily) and let it run to completion. – Synetech – 2012-10-02T00:51:13.037

That looks great. But is there any security risk? – Robby Shaw – 2012-10-02T23:00:03.943

You guys saved my life. It works and everything is just great. However, it took me some time to set up the torrent file: I had to change the advanced settings first (enable the tracker), then change the tracker address to my local address and port. – Robby Shaw – 2012-10-03T06:06:32.813

@Robby Shaw - Just to satisfy my personal curiosity, how long did it take to transfer the 70 GB this way? – laurent – 2012-11-07T16:55:02.737

2

FTP is awful -- it was born in the mid-1980s and, frankly, it should've died there.

I'd probably start with scp (Secure Copy), which should be part of the openssh or openssh-client package on your favorite Linux distro (including Cygwin), or available as part of the PuTTY package (as pscp.exe) if you're running Windows without Cygwin. You'll need to configure an SSH server on the destination host; that's pretty straightforward assuming you've got root/Administrator access (if you don't, things get harder). Once the SSH server is running and you can reach it from the source host, it's just a matter of

 user@source $ scp /path/to/file user@destination:/path/to/receiving/directory

This should satisfy your point 1 pretty well, since scp has reasonably low overhead; it should certainly satisfy point 2, as it will definitely detect a failing connection and can be (probably) configured or (certainly) scripted to retry as many times as necessary; it covers point 3 easily, as no intermediate host or service is required; and it covers point 4 nicely as well, since you can install an ssh server on both hosts and then transfer the file in whichever direction you prefer. You also get encryption for free, which may or may not be of use to you.
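The retry scripting mentioned above can be as small as a shell loop around scp. A minimal sketch (the user name, host, and paths in the commented-out example are placeholders, and the 30-second delay is an arbitrary choice):

```shell
#!/bin/sh
# Seconds to wait between attempts; override with RETRY_DELAY=n if desired.
RETRY_DELAY=${RETRY_DELAY:-30}

retry() {
    # Re-run the given command until it exits with status 0.
    until "$@"; do
        echo "transfer failed; retrying in ${RETRY_DELAY}s..." >&2
        sleep "$RETRY_DELAY"
    done
}

# Example invocation (placeholder user, host, and paths):
# retry scp /path/to/file user@destination:/path/to/receiving/directory
```

With Cygwin's openssh package installed on both Windows servers, the same loop works unchanged.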

The OpenSSH manual is probably a good place to start, and I'll be glad to offer further assistance if you do end up going this route -- I have some experience in using scp/ssh for these sorts of transfers (although not from Canada to Africa or vice versa, and not for a single file topping 70GB in size, I admit!)

Hope this helps!

Aaron Miller

Posted 2012-10-01T22:35:54.077

Reputation: 8 849

2

I think it would be a good idea to split the file into several small ones, transmit them, and then put them together again on the remote server.

An example (on Linux) of how to split and concatenate is here: http://www.techiecorner.com/107/how-to-split-large-file-into-several-smaller-files-linux/

You should also have something like an ssh connection to the remote server so that you can concatenate them there.

A tool like md5sum would also be helpful, to check that the transmitted files arrived unchanged by comparing their hashes.

You could either write a small shell script to automate some of this, so that you can painlessly transmit a lot of small files, or do it manually with fewer but bigger chunks...
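Following the approach in the linked article, the whole round trip can be sketched in a few lines of shell (runnable on Linux or under Cygwin on Windows; the 1 MB file and 256 KB chunks are tiny stand-ins for the real 70 GB case):

```shell
#!/bin/sh
set -e
# Create a small demo file standing in for the huge one.
head -c 1048576 /dev/urandom > bigfile

split -b 262144 bigfile bigfile.part.    # 256 KB chunks: .aa, .ab, ...
md5sum bigfile.part.* > bigfile.md5      # checksums to send along

# --- after copying the chunks and bigfile.md5 to the remote server ---
md5sum -c bigfile.md5                    # verify each chunk arrived intact
cat bigfile.part.* > bigfile.rejoined    # lexical order restores the file
cmp bigfile bigfile.rejoined && echo "reassembly OK"
```

Because md5sum -c reports each chunk individually, a corrupted chunk is pinpointed and only that chunk needs re-sending, not the whole 70 GB.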

Boris Däppen

Posted 2012-10-01T22:35:54.077

Reputation: 616

Yup, if there's no good tool for this, I will probably use your method. But since the file is more than 70 GB, I'd need to split it into a lot of files and check the MD5 sum for every one of them. It sounds like too much work for me :) – Robby Shaw – 2012-10-02T23:03:45.903

2

Depending on your upload and download speeds (usually, upload is the bottleneck), your best option may be to write the file to a hard drive and ship it via FedEx, DHL or a similar courier for someone to copy to the server, if possible.

For example, if your upload speed is 1 Mbps, you will upload at about 1M / 8 = 128 KB/s. So, not counting any "problems" like encryption overhead (with scp, for example) or a connection that is not at 100% full rate, your file will need 70G / 128K ≈ 573,000 s, or about 159 hours (more than 6 days). If your connection is not very stable, it will take (possibly a lot) more time.
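The same estimate, spelled out as shell arithmetic (128 KB/s is this answer's own approximation of a 1 Mbps uplink):

```shell
#!/bin/sh
# Back-of-the-envelope transfer time for a 70 GB file at 128 KB/s.
rate_kb=128                      # 1 Mbps taken as 128 KB/s
file_kb=$((70 * 1024 * 1024))    # 70 GB expressed in KB
seconds=$((file_kb / rate_kb))
echo "${seconds} s = $((seconds / 3600)) h = $((seconds / 86400)) days"
```

At roughly 159 hours per run, even a modest improvement in effective throughput saves days.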

laurent

Posted 2012-10-01T22:35:54.077

Reputation: 4 166

Because we will transfer the files regularly (every 5 to 10 days), I don't think sending a hard drive is doable. My upload speed is quite fast, but the other site seems to suffer packet loss. – Robby Shaw – 2012-10-02T22:56:21.950

1

I'd consider putting the file onto a set of DVDs or BD-ROMs or 2.5" HDD and airmail them.

If your upload bandwidth is 1 Mbit/s, 70 GB could take 6 days to transfer over the Internet.

RedGrittyBrick

Posted 2012-10-01T22:35:54.077

Reputation: 70 632

Because we will transfer the files regularly (every 5 to 10 days), I don't think sending a hard drive is doable. – Robby Shaw – 2012-10-02T23:00:54.470

1

Put the file on the FTP site as you tried, but from Africa use

wget -c ftp://ftp.server.com/filename

-c will resume an interrupted download

jet

Posted 2012-10-01T22:35:54.077

Reputation: 2 675

FTP is totally out of date – davidbaumann – 2015-11-11T05:41:29.413

@davidbaumann "Says the guy replying to a comment from 3 years ago" - Another guy replying to a comment from 3 years ago. – Phillip Copley – 2018-06-01T15:42:50.790

1

If you're using Mac OS X or Linux on both sides (?), then rsync might be your best bet.

Check out the manual pages here.

Leif

Posted 2012-10-01T22:35:54.077

Reputation: 400

Sorry, forgot to mention it is Windows OS. Thanks anyway – Robby Shaw – 2012-10-02T23:04:27.523

0

I know this is an old question, and someone's already suggested rsync, but the following concrete example worked well for me when moving huge files between servers:

rsync -r -v --progress --partial -e ssh user@remote-system:/path/to/remote/file /path/to/local/destination/

You invoke this command on the destination side, so that rsync pulls the huge file from the remote server into a local directory. The example can be modified to sync entire directories instead of a single file.

nemesisfixx

Posted 2012-10-01T22:35:54.077

Reputation: 2 811

0

Mail.ru offers 100 GB of space. You could transfer your file that way. Of course, you have to be able to read Russian, or use a translation tool.

dzon

Posted 2012-10-01T22:35:54.077

Reputation: 1

-1

I had the same situation: the destination server was remote, but in the same country, so I used TeamViewer's file-transfer feature. It worked perfectly on Windows Server.

Abdul

Posted 2012-10-01T22:35:54.077

Reputation: 99