
I'm running an Amahi server, which is basically a Fedora 14 x64 installation. I'm looking for a good solution to back up the server's 200 GB system drive to an external USB/eSATA drive every night. I looked into using dd, but since other things might be running on the server at the same time it didn't feel safe. I would like the backups to be incremental, so the runs after the initial one are quick. The backup should also be bootable, or at least allow me to produce a bootable disk after booting from a CD or something.

I would also like the server to be able to do similar backups of my clients, which run Ubuntu, Windows 7 x64, Windows 7 Starter, OS X Lion, Windows XP and so on. So no applications that only back up shared folders or the like. My guess is that a client daemon would be needed to quiesce the system, so that a Windows system drive, which can otherwise be quite cranky, can be backed up.

My ideal goal: boot a crashed client from a CD, connect to the server, restore the latest backup, and be up and running again.

Is there anything out there that would fit these needs?

inquam

2 Answers


I would say you need a different backup solution for each operating system.

For restoring Linux you usually do not need an image (though an image can be faster to restore). Saving all files with their permissions is enough to restore a bootable Linux system. So I suggest using BackupPC, or a client-initiated backup using rsync/tar/dar. You can create consistent Linux backups using LVM snapshots; if you did not set up LVM during installation, you would need to reinstall to use it. See BackupPC+LVM.
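A rough sketch of the LVM snapshot approach, run as root. The volume group and logical volume names (vg_main/lv_root) and paths are placeholders; check yours with lvs:

```shell
# Create a temporary copy-on-write snapshot of the root LV
# (vg_main/lv_root and the 5G COW size are example values).
lvcreate --snapshot --size 5G --name rootsnap /dev/vg_main/lv_root

# Mount the frozen view read-only and back it up from there,
# so the backup sees a consistent point-in-time filesystem.
mkdir -p /mnt/rootsnap
mount -o ro /dev/vg_main/rootsnap /mnt/rootsnap
rsync -a /mnt/rootsnap/ /mnt/backup/system/

# Drop the snapshot so it stops consuming copy-on-write space.
umount /mnt/rootsnap
lvremove -f /dev/vg_main/rootsnap
```

The snapshot only needs enough space to hold blocks that change while the backup runs, not a full copy of the volume.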

I do not know much about Macs, but Time Machine seems good enough for your purposes. With some hackery you can use Linux as a Time Machine target. If you want the plug'n'play(TM) Apple experience, you should buy a real Time Machine box ;)

For Windows you have a few options: use the integrated image backup (in Windows 7) to a network share on your home server; go with Windows Home Server, which has the kind of client you mention; or use UrBackup, which offers that CD-based restore over the network.
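For the Windows 7 built-in image backup, the command-line form can be scheduled from an elevated prompt on the client. This is a sketch; the share name \\homeserver\backups and task name are placeholders:

```bat
rem Create a system image of the critical volumes on a network share.
wbadmin start backup -backupTarget:\\homeserver\backups -include:C: -allCritical -quiet

rem Schedule the same command to run nightly at 02:00.
schtasks /create /tn "NightlyImage" /sc daily /st 02:00 ^
  /tr "wbadmin start backup -backupTarget:\\homeserver\backups -include:C: -allCritical -quiet"
```

Note that Windows image backup to a network share keeps only the most recent image per target, so it is not incremental in the way rsync-style tools are.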

UrOni
  • Seconding BackupPC. I've used it before as a third party and recently set it up myself on my own network. It's very, very good IMHO. Highly recommended. It'll also back up Linux, Windows and Mac (and its own localhost) easily enough. – Sirex Sep 05 '11 at 11:03

Getting a perfect snapshot while a Unix system is live is hard. You could reboot, or change to a runlevel where not much is running, to get everything into a safe state. Otherwise you will always have some risk of the backup not being fully consistent - as you say yourself.

For backing up Unix (Linux and OS X) incrementally, you might look at rsync. There are various higher-level wrappers around it, but the basic form is:

rsync -a --link-dest=/mnt/backup/snap-20110904 / destsrv:/mnt/backup/snap-20110905

will create a new tree under /mnt/backup/snap-20110905 containing all the files; where a file has not changed from the previous backup (the one specified with --link-dest), it is hard-linked into the new tree rather than copied. So you keep both yesterday's snapshot and today's snapshot, with the two sharing as much disk space as can be shared. That covers your incremental requirement.
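Wrapped in a small nightly script, with the dated directory names computed automatically, it might look like this (a sketch; "destsrv" and /mnt/backup are the example names from above, and the excludes are illustrative):

```shell
#!/bin/sh
# Nightly incremental snapshot using rsync --link-dest.
# Each run creates a full-looking tree that hard-links unchanged
# files against the previous night's snapshot.
TODAY=$(date +%Y%m%d)                    # e.g. 20110905
YESTERDAY=$(date -d yesterday +%Y%m%d)   # GNU date syntax

rsync -a --one-file-system \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
  --link-dest=/mnt/backup/snap-"$YESTERDAY" \
  / destsrv:/mnt/backup/snap-"$TODAY"
```

The --one-file-system flag keeps rsync from descending into pseudo-filesystems and other mounts; drop it if you want separate partitions included, and add explicit excludes instead.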

This won't give you a bootable snapshot, though you will be able to easily copy the files onto a new drive and make it bootable. That covers your secondary bootable requirement.
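Making the restored copy bootable again typically means reinstalling the boot loader from a chroot. A sketch, assuming the files have already been copied onto a new root partition (/dev/sdb1 and /mnt/restore are example names):

```shell
# Mount the restored root and bind the virtual filesystems the
# boot loader installer expects to see.
mount /dev/sdb1 /mnt/restore
mount --bind /dev  /mnt/restore/dev
mount --bind /proc /mnt/restore/proc
mount --bind /sys  /mnt/restore/sys

# Reinstall GRUB onto the new drive from inside the restored system.
chroot /mnt/restore grub-install /dev/sdb

# Also check /etc/fstab and the boot loader config for device
# names or UUIDs that changed with the new drive.
```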

rsync runs over ssh, so as long as you can ssh from the client to your backup server, you can back up a number of Unix-based systems onto the same backup server.

I have no idea how well that would work under Windows (via Cygwin, for example). My assumption, without further investigation, is that it wouldn't work well at all. It would probably be fine for data-like files (e.g. Word docs) that don't have complicated file-system permissions or extended attributes, but probably not for system files.

Ben Clifford