Transfer a large number of images (30GB) to a virtual server from a slow, old Linux machine?

1

I need to transfer 200,000 images to a Windows EC2 instance from a really old Linux machine. I don't have much experience with Linux, so I haven't tried plain FTP because I don't think we have the proper software installed; if it's a quick and easy setup I'm open to that solution, but this machine is really slow and awkward to work with, especially for a Linux newcomer. I've tried copying the images to a USB drive so I could upload them from a different PC, but the copy freezes and stops after about 1 GB. What would you suggest as the most effective way to accomplish this?

Ryan Gedwill

Posted 2016-05-24T03:46:57.790

Reputation: 11

Answers

2

No matter which copy/transfer method you use, you should pack the images into an archive first. That ensures two things: the single file you transfer is known to be good (as opposed to many files, any of which could be unreadable), and the transfer itself goes much more smoothly. Large numbers of small files are a pain for any kind of copying or transferring. Once you have one big archive file, you can use any valid method, including the scp command mentioned in the other answer.
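As a minimal sketch of the archiving step on the Linux side (the directory and file names here are placeholders for wherever your images actually live):

    # pack the whole image directory into one compressed archive;
    # -c creates, -z gzip-compresses, -f names the output file
    tar -czf images.tar.gz /path/to/images

A single 30GB tarball transfers far more efficiently than 200,000 individual files, and you can verify it arrived intact before unpacking on the Windows side.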

Overmind

Posted 2016-05-24T03:46:57.790

Reputation: 8 562

1

There is no right answer here - for the most part, file transfers do not use a lot of resources other than bandwidth, so the speed of the machine is probably not too big a deal.

The answer depends on what variant of Linux you have, and how you connect between the client and the server. The most obvious solution, if you have SSH available, is to use rsync. Rsync comes with most distros, or can be trivially added with something like "apt-get install rsync" or "yum install rsync". The nice part about using rsync is that if the transfer fails partway through, you can just re-run the command and it will pick up where it left off.
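A minimal sketch of such a command, assuming rsync and an SSH server are reachable on the EC2 instance (for example via Cygwin) and with the user name, host name and paths as placeholders:

    # -a preserves attributes, -v shows progress,
    # --partial keeps partially transferred files so a re-run can resume them
    rsync -av --partial /path/to/images/ user@ec2-hostname:/destination/path/

Re-running the exact same command after an interruption only transfers whatever is still missing.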

If rsync is not an option, the next logical solution would be scp: run "scp -r serverip:/path /destpath". This will work as long as the server has SSH.
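In the asker's situation that would likely mean running the copy from the Windows EC2 instance (for example with an OpenSSH client, or the pscp tool that ships with PuTTY, which uses the same syntax) and pulling from the old Linux box; the user, host and paths below are placeholders:

    # recursively copy the image directory (or the single archive) from the Linux machine
    scp -r user@old-linux-box:/path/to/images C:\images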

If that does not work, try using wget or ncftp to download via FTP. FTP is not a great protocol, though.
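If you do go the FTP route, a recursive pull with wget could look like the following sketch. This assumes an FTP server is actually running on the old machine and reachable from the EC2 instance; the credentials and path are placeholders:

    # -r recurses into directories; credentials are passed on the command line
    wget -r --user=someuser --password=somepass ftp://old-linux-box/images/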

davidgo

Posted 2016-05-24T03:46:57.790

Reputation: 49 152

0

Rereading the problem, it turns out the solution is different from what I'd suggested initially. We need something cross-platform that handles file transfers reliably and is dead simple.

BitTorrent Sync should work. The Linux client is a simple, single binary with its own web UI. Set up a share there, then use the Windows client on the other side to download the directory.

Since the underlying protocol is BitTorrent, it should handle interruptions and check for errors fairly gracefully, and there's pretty much no configuration needed beyond creating the shares.

Journeyman Geek

Posted 2016-05-24T03:46:57.790

Reputation: 119 122