
Let's say I have two Linux servers: one running a desktop app database on a local network, and a web server.

I want to automatically transfer a database file (a few MB in size) from the local server to the web server every hour to push updates.

The information needs to be encrypted, and the server has to be authenticated in order to make the transfer.

I was thinking of the SSH File Transfer Protocol (SFTP). My questions are:

Is it the best option?

What other options do I have?

What existing library or script can I use to automate the task (on both servers)?

  • scp I think is your best/easiest option. Configuration is straightforward. Set up key-based authentication so that you can script it using bash. Here's an answer on how to do that: http://superuser.com/a/169815 – RoraΖ Aug 18 '14 at 15:44

2 Answers


The best way is to use SCP with SSH keys. It's some of the best encryption commonly available, designed to work without intervention, very reliable, easy to script, and extremely well documented. Set up your keys and a cron job.
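As a minimal sketch of the one-time key setup, assuming the web server is reachable as web.example.com and the remote account is user (both hypothetical placeholders):

ssh-keygen -t rsa -b 4096 -f ~/.ssh/transfer_key -N ""           # dedicated key with no passphrase, so cron can use it
ssh-copy-id -i ~/.ssh/transfer_key.pub user@web.example.com      # install the public key on the web server

The first connection also records the web server's host key in known_hosts, which is what authenticates the server on every later transfer.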

The only other good option is to change your application design to sync the data over an encrypted channel, such as a MySQL slave server using SSL (the actual solution depends heavily on your app).
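As a rough sketch of that alternative (not a complete recipe), you would point a replica on the web server at the local server and require SSL for the link. The host, user, and password below are hypothetical placeholders, and the master side needs binary logging and a replication user set up first:

# Hedged sketch: run on the web server's MySQL instance.
# 'local.server', 'repl', and 'secret' are placeholders.
mysql -e "CHANGE MASTER TO
    MASTER_HOST='local.server',
    MASTER_USER='repl',
    MASTER_PASSWORD='secret',
    MASTER_SSL=1;
  START SLAVE;"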

For scripting it, write up your own shell script:

#!/bin/sh
# Copy the database file to the web server over SSH and append all output to a log.
SOURCE="/path/to/source"
DEST="user@ip.ad.dr.ess:/path/to/dest"
KEY="/path/to/sshkey"
LOG="/path/to/logfile"

# -i selects the private key; quote variables in case paths contain spaces
scp -i "$KEY" "$SOURCE" "$DEST" >> "$LOG" 2>&1
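To schedule it hourly, add a crontab entry with crontab -e; the script path is a placeholder for wherever you saved the script:

0 * * * * /path/to/transfer.sh    # minute 0 of every hour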
– cscracker

+1 for SCP, but you can also use rsync over SSH. It's very similar, and you can be as granular as you like with regard to file types, exclusions, etc.
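As a rough equivalent of the scp line above, reusing the same (placeholder) key and paths:

# -a preserves permissions/timestamps, -z compresses, -e runs rsync over ssh with the given key
rsync -az -e "ssh -i /path/to/sshkey" /path/to/source user@ip.ad.dr.ess:/path/to/dest

Since rsync only transfers the parts of the file that changed, repeated hourly runs stay cheap even if the database grows.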

– James Spiteri