
I'm a system admin with little security experience. I want to schedule the copying of our backups to protect our data in case of fire. The best solution would be to copy the data onto a new hard disk and lock it in a safe; practically, however, the chances of someone doing this regularly are slim. I therefore want to automate the copy to a computer/hard disk far away from the building.

The data is very sensitive and the chances of a sophisticated attack are considered fairly high. I am therefore looking for a solution with very limited functionality, largely to prevent myself (or a future system admin) from misconfiguring it out of ignorance. In short, I want my server to be able to communicate with one well-defined external computer, and only that computer. Is this something that is possible to guarantee?

The company cannot afford a leased telephone line, which is the only way I can currently think of to achieve this aim. (I worry, for example, that a VPN offers too much functionality and would turn me into a security risk.)

Currently the office server (which makes the backups) is isolated from the internet. Desktop users (who do have internet access) have only low-privilege accounts on the server. I would like to keep the server and the backup server as isolated from the internet as possible, allowing only the scheduled copy to pass to the outside world. Is there any alternative to a VPN for this?

Finally, since the server in the distant location is only required to copy the backup onto a hard disk, is there a way of locking that system down so that only the transmitted documents are saved to the disk (preventing malware from accidentally being installed)?

All the solutions I have found so far involve either a VPN or a leased telephone line. I think the first is far from foolproof, and I can't afford the second. Any advice welcome.

EDIT: I'm not discounting a VPN, but I would need to convince myself (and my supervisors) that the method is near idiot-proof, because I have not used this technology before.

Tom

3 Answers


Do you need just the most recent copy of the data, or a history of backups, like version control?

Supposing you have two computers running Linux (easier), or any other OS that has the needed tools, and you want only the latest backup:

  • Install SSH and create a tunnel between the two computers. You can choose as strong a key as you need (see the sketch below).
  • Configure the firewall to accept communications only between the two computers.
  • Configure rsync to run the backup between the servers at the frequency you need (e.g. a daily backup running every night).
  • The destination (remote) computer could have its disk encrypted, and it could fetch the key from the source server.

And be sure that physical access to the destination server is limited.
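
A minimal sketch of that setup, assuming iptables as the firewall and placeholder addresses (192.0.2.10 for the office server, 198.51.100.20 for the backup server):

```
# On the office server: accept SSH only from the backup server's fixed address
iptables -A INPUT -p tcp -s 198.51.100.20 --dport 22 -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j DROP

# On the backup server: a dedicated, passphrase-less key for the unattended job
ssh-keygen -t ed25519 -f /root/.ssh/backup_key -N ''
# Install its public key on the office server
ssh-copy-id -i /root/.ssh/backup_key.pub root@192.0.2.10
```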

This way:

  • At a scheduled time, the backup server mounts the encrypted disk, fetching the key held on the source server.
  • rsync then updates the files on the destination drive.
  • When it finishes, you have the files in their most recent state.
  • Unmount the drive, and you're done. (A sketch of this cycle as a script follows.)
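
A hedged sketch of that cycle as a script on the backup server, assuming the disk is LUKS-encrypted (cryptsetup) and reusing the placeholder address and key from above:

```
#!/bin/sh
# Nightly pull, run on the backup server (e.g. from cron at 02:00).
# The disk key lives only on the office server; device and paths are placeholders.
set -e

# 1. Fetch the key from the source server and unlock the encrypted drive
ssh -i /root/.ssh/backup_key root@192.0.2.10 'cat /root/disk.key' \
    | cryptsetup open --key-file=- /dev/sdb1 backup
mount /dev/mapper/backup /mnt/backup

# 2. Pull only what changed since last night
rsync -a -e 'ssh -i /root/.ssh/backup_key' \
    root@192.0.2.10:/srv/backups/ /mnt/backup/

# 3. Lock everything up again
umount /mnt/backup
cryptsetup close backup
```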

Rsync is good because it transfers only the differences between files, so backups with it are very fast.

It's possible to use rsync to make incremental backups too: take a look at this site.
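
One common technique is rsync's --link-dest option: unchanged files are hard-linked against the previous snapshot, so each dated directory looks like a full backup while only changed files consume space. A sketch with illustrative paths:

```
# Create a new dated snapshot; unchanged files are hard-linked to the
# previous one instead of being stored again.
today=$(date +%F)
rsync -a --link-dest=/mnt/backup/latest \
    root@192.0.2.10:/srv/backups/ "/mnt/backup/$today/"
ln -sfn "/mnt/backup/$today" /mnt/backup/latest
```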

woliveirajr
If you're using rsync and want incremental / semi-versioned backups, look at [rsnapshot](http://rsnapshot.org/). It wraps rsync+ssh in a way that you can have distinct daily/weekly/monthly backups without wasting space for duplicates. – bstpierre Oct 25 '11 at 12:11

Woliveirajr covers a lot of good points. SSH makes for a very good, simple, hard-to-screw-up backup tunnel.

In addition, you can bring the SSH daemon online only right before the backup runs - no need to expose that attack vector during the rest of the day.
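
For example, a sketch using cron on a systemd-based Linux (the unit is named sshd on some distributions and ssh on others; times are illustrative):

```
# /etc/cron.d/ssh-backup-window: expose sshd only around the 02:00 backup job
55 1 * * * root systemctl start sshd
30 2 * * * root systemctl stop sshd
```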

You haven't mentioned the size of the data you are backing up. If the data isn't too large, I wouldn't bother with incremental backups. Instead I'd encrypt each full backup with one key and keep the decryption key secret. You can layer this on top of the whole-disk encryption mentioned above.
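
A sketch of that approach with GnuPG (the recipient key ID is a placeholder; the private key can stay offline, so neither server ever holds the means to decrypt):

```
# Encrypt the full backup to a public key before it leaves the machine
tar -czf - /srv/data | gpg --encrypt --recipient backup@example.com \
    > /srv/backups/backup-$(date +%F).tar.gz.gpg
```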

And to clarify some hopefully obvious points:

  • Don't use DNS - use hard-coded IP addresses (in fact, block DNS, NTP, updates, etc.)
  • Don't run any services on the backup server -- it should phone home
  • Use SSH keys to authenticate, and restrict what each key may do (see the sketch below)
  • Lock down physical access
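
On the SSH-keys point in particular, OpenSSH lets you pin a key to one client address and one forced command in authorized_keys, so even a stolen key is nearly useless. A sketch (address and path are placeholders; rrsync is a restricted-rsync helper shipped with rsync, and its location varies by distribution):

```
# ~/.ssh/authorized_keys on the machine that accepts the connection:
# the key works only from one IP, gets no tty or forwarding, and can only
# run rsync read-only rooted at /srv/backups.
from="198.51.100.20",command="/usr/bin/rrsync -ro /srv/backups",no-pty,no-agent-forwarding,no-port-forwarding,no-X11-forwarding ssh-ed25519 AAAA... backup-key
```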

Pick up the good O'Reilly book on SSH.

Bradley Kreider

Your most reliable solution is going to be client-side encryption. If you encrypt your backup data before pushing it out over the internet, then you can be certain of its security no matter what happens. Whether a connection is misconfigured, a backup is sent to the wrong host, or the backup destination is compromised -- if your backups are encrypted opaque blobs, you can rest easy.

There are numerous ways to achieve this. The simplest is to encrypt your backups with a tool like 7-zip or PGP. Alternately, you can create an encrypted container a la TrueCrypt. My favorite solution, though, is using the --reverse option of encfs to create an encrypted view of your filesystem; then you can just rsync it across the network without having to worry about who sees it.
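
A sketch of the encfs route (paths are placeholders; reverse mode presents an encrypted view of an existing plaintext directory without duplicating the data, and prompts for a password on first use):

```
# Expose /srv/data as an encrypted view at /mnt/encrypted
encfs --reverse /srv/data /mnt/encrypted

# Ship only ciphertext; the remote end never sees plaintext
rsync -a /mnt/encrypted/ backup@remote.example.com:/backups/

# Tear the view down again
fusermount -u /mnt/encrypted
```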

This is a fairly common problem, and there are numerous ready-made solutions for what you're trying to do. Usually this involves making secure backups using Amazon S3 or a similar cloud storage offering. The list of options changes from time to time, so just Google for encrypted s3 backup to get some ideas.

tylerl