
I have a rough idea of how this might work, but I'm not sure if it's possible, as I have never done something like this before. So I was wondering if anybody who is experienced can help me out.

I currently have one dedicated machine, which I will call "Machine A", with about 100GB of data. I would like to be safe and back up the 100GB of data every day, or maybe every few days, whichever works best. So my idea is to buy another machine off-site and store the backup data there in case Machine A's hard drive fails.

However, downloading the content to the new machine (I'll call it Machine B) manually every day would be quite time-consuming and repetitive. So I was wondering if it is possible to make Machine B automatically download certain directories from Machine A. Maybe a cron job using rsync?

Also, since it will be downloading a lot of data, how would I go about automatically deleting the old backups while still keeping the newer ones?

If using a cron job isn't efficient and you know another way to do this easily, I would appreciate it a lot. Thanks in advance!

Other notes: I'm using CentOS on Machine A, and probably CentOS or Debian on Machine B.

John

1 Answer


Yes, that's possible with cron jobs; it's a very common approach to backups. All you need is a set of SSH keys to avoid password prompts and, of course, SSH/rsync access in at least one direction.
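As a rough sketch (the user name backupuser and the hostname machine-a.example.com are placeholders for your own setup), the key setup on Machine B would look something like this:

    # On Machine B: generate a key pair without a passphrase so cron can use it
    ssh-keygen -t rsa -b 4096 -f ~/.ssh/backup_key -N ""

    # Copy the public key to Machine A so rsync over SSH works without a password
    ssh-copy-id -i ~/.ssh/backup_key.pub backupuser@machine-a.example.com

    # Verify that the key-based login works
    ssh -i ~/.ssh/backup_key backupuser@machine-a.example.com echo ok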

Depending on your needs, you can keep only one set of backups (with the --delete... options of rsync) or multiple differential copies with the help of rsync's hard-link options, but for the latter, better look into helper tools like rsnapshot.
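For example (a minimal sketch; the paths, user name, and schedule are placeholders you would adapt), a crontab entry on Machine B that pulls a mirror of Machine A every night at 02:00 could look like:

    # crontab -e on Machine B: run every night at 02:00
    # -a        preserve permissions, times, symlinks, etc.
    # -z        compress data during transfer
    # --delete  remove files on B that were deleted on A, keeping B an exact mirror
    0 2 * * * rsync -az --delete -e "ssh -i /home/backup/.ssh/backup_key" backupuser@machine-a.example.com:/data/ /backups/machine-a/

If you want multiple dated copies instead of a single mirror, rsnapshot wraps the same rsync/hard-link approach and handles rotation for you (e.g. a `retain daily 7` line in rsnapshot.conf keeps the last seven daily snapshots), so old backups are deleted automatically.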

Sven
    See also: http://serverfault.com/questions/545751/rsync-for-generating-daily-backups-with-some-sort-of-history/545755#545755 – fukawi2 Oct 17 '13 at 02:59