We're moving servers, and I need to transfer all the data from Server A to Server B.

I have a tar.gz of about 100 GB that contains all the Server A files.

I'd really like to avoid downloading the file locally on my computer, and uploading it to Server B.

I only have FTP access to Server A, which means no SSH. However, I do have SSH access to Server B.

What's the best way to transfer the file? I was thinking of moving my tar.gz file to public_html temporarily and downloading it with wget. Would that work? Otherwise, I could use FTP through an SSH session on Server B.
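
For illustration, the public_html idea would boil down to something like this, run from a shell on Server B (serverA.example.com is a placeholder for Server A's web address):

wget -c http://serverA.example.com/largefile.tar.gz

The -c flag matters at this size: if the connection drops, the download resumes from where it stopped instead of starting the 100 GB over.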

2 Answers

Something like:

ssh user@serverB
nohup wget -bqc ftp://path/largefile.tar.gz

wget options:

-b : run in background
-q : quiet
-c : continue a partially-downloaded file (so you can restart the transfer if it breaks)

This runs wget in the background, so it should keep going after you exit the ssh shell; the nohup makes sure it isn't killed by the hangup signal when you log out.
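
One caveat, based on wget's default behaviour: with -b it writes progress to a wget-log file in the working directory, but the -q flag above silences that output. If you want to be able to check on a 100 GB transfer, swap -q for -nv (terse but not silent) and follow the log:

nohup wget -bc -nv ftp://path/largefile.tar.gz
tail -f wget-log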

Because you're initiating the download from serverB, your desktop machine isn't involved in the file transfer except to set it up.

hookenz

If the data isn't very sensitive and the network path is trusted, ssh into B and download straight from A via plain FTP. Tunnelling the transfer through SSH instead would make the download considerably slower because of the encryption overhead. If possible, split the 100 GB file into multiple smaller pieces, especially if the FTP server on A doesn't support resuming downloads.
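
A sketch of that approach, assuming lftp is available on Server B (the host, user, and path below are placeholders): lftp can both resume an interrupted transfer and split a single file across several FTP connections, which gets you much of the benefit of splitting the file without needing shell access on A.

# Run on Server B; serverA.example.com, ftpuser and the path are hypothetical
lftp -u ftpuser -e 'pget -n 4 -c /path/largefile.tar.gz; quit' ftp://serverA.example.com

Here pget -c continues a partial download and -n 4 fetches four segments of the file in parallel.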

stoned