
I work as a developer in a small company and am far from being a Linux pro or admin. Nonetheless I have set up an Ubuntu server for documentation purposes (LAMP with a wiki) and versioning (SVN).

Our other environment is completely Windows-based. I want to back up both the SVN repository and the MySQL database. For that purpose, our admin has set up a share on our backup server.

What I want is a convenient & easy way to pack all the stuff that is needed (for restoring the wiki and the SVN stuff) and put it in that shared directory (which gets backed up on a regular basis).

What do I need to back up?

  • SQL-dump
  • SVN-directory
  • ???

How do I do this?

  • I know how to create an SQL dump and save it to a directory, of course, but what next & what else?

How do I automate these tasks?

  • I know how to do it under Windows but not Linux
Michael Niemand

3 Answers


You're going to need to back up the files for the wiki and SVN as well as the databases. I would strongly advise that you keep more than one backup. If you have just one backup that you overwrite each day, you can easily end up replacing your only backup with corrupt data should something happen to your data. I'd suggest a rolling 7-day backup strategy, say using a Monday, Tuesday, etc. backup. That way you have a week to notice there is a problem before you replace your last good backup with corrupted data.
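
For the SVN repository in particular, it's safer to let Subversion produce the copy than to tar the live repository directory while it might be in use. A quick sketch (the repository and target paths are just illustrations, adjust them to your setup):

# consistent point-in-time copy of the whole repository
svnadmin hotcopy /var/svn/repo /home/backup/svn-repo-copy

# or a portable dump file that can be loaded into any Subversion version
svnadmin dump /var/svn/repo | gzip > /home/backup/svn-repo.dump.gz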

By far the simplest thing to do is to write your own backup script that then gets called nightly by root's cron. Personally I use Perl for my scripts because that happens to be my language of choice, but you could use any scripting language you like. Whatever you choose to write it in, you'll end up using the same tools:

  1. tar - for creating an archive
  2. mysqldump - for exporting the DBs
  3. SCP/SFTP/rsync - you'll need one of these tools for pushing the data to another machine once you've compiled what you need using tar and mysqldump.

If it helps, here's an anonymised version of one of my backup scripts:


#!/usr/bin/perl

use strict;

#
# Set up variables
#

my $dbname = 'xxxxx';
my $dbuser = 'xxxxxxx';
my $dbpass = 'xxxxxxx';
my $uploadsDir = '/xxxxxx/blog/wp-content/uploads/';
my $backupLocal = '/home/xxxxx/backup/blog/';
my $baseFolderName = 'blogBackup';
my $backupRemote = 'user@server:~/backup/blog/';


#
# Should never have to edit anything below here
#

#
# Step 1 - get the day of the week and set up the folder to use for the backup
#

my @weekDays = qw(Sun Mon Tue Wed Thu Fri Sat);
my @time = localtime();
my $dayOfWeek = $weekDays[$time[6]];
my $backupFolder = "$baseFolderName-$dayOfWeek";

# if the folder exists, delete it
my $ignore;
if(-e "$backupLocal$backupFolder"){
  $ignore = `/bin/rm -rf $backupLocal$backupFolder`; 
}
# create the folder
$ignore = `/bin/mkdir $backupLocal$backupFolder`;


#
# Step 2 - do the DB backup
#
$ignore = `/usr/bin/mysqldump -u $dbuser --password=$dbpass $dbname > $backupLocal$backupFolder/database.sql`;

#
# Step 3 - do the uploads dir backup
#
$ignore = `/bin/tar -pczf $backupLocal$backupFolder/uploads.tar.gz $uploadsDir`;

#
# Step 4 - scp the backup to the remote location
#
$ignore = `/usr/bin/scp -r $backupLocal$backupFolder $backupRemote`;
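
To run a script like this nightly from root's cron, one crontab line is all it takes. A sketch, assuming the script is saved as /root/bin/backup.pl and marked executable (the path and the time are just placeholders):

# open root's crontab for editing
sudo crontab -e

# then add a line like this to run the backup every night at 02:30
30 2 * * * /root/bin/backup.pl
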
Bart B

I would recommend using the package backupninja - it is basically a wrapper around a set of scripts for automatically backing up various services. It can use duplicity, rdiff-backup (my preference), burn DVD ISOs, etc.

sudo apt-get install backupninja rdiff-backup

And to get you started (it will guide you through setting up the various parts):

sudo ninjahelper

This also allows you to add arbitrary extra paths to the backup while it's at it anyway. Having had to rescue a few machines from backups, I find it handy to be able to copy over a known /etc, /srv/http (where I keep web stuff) and the database dumps.

Once that's done, the only problem is that ninjahelper insists that you either set a root password on the remote machine or manually transfer an SSH key (if you do remote backups at all):

On the local machine:

sudo ssh-keygen
sudo scp /root/.ssh/id_dsa.pub YOUR_USERNAME@remote.machine.tld:backup_key

Ubuntu comes with a backup user by default, so we add the SSH key to allow remote logins:

sudo mkdir -p /var/backup/.ssh
sudo mv backup_key /var/backup/.ssh/authorized_keys
sudo chown backup:backup /var/backup/.ssh/authorized_keys

(This is from memory, so I might be a bit off on the exact commands...)
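
As for scheduling: backupninja runs from its own cron job and takes its run time from /etc/backupninja.conf. Again from memory, the relevant setting looks roughly like this, so double-check the file on your system:

# /etc/backupninja.conf -- when the configured backup actions are run
when = everyday at 01:00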

Morten Siebuhr
  • Backupninja seems pretty cool! I installed it (getting write access to that Windows share was a pain in the ass, though) and made the SQL backup, which worked absolutely great and very easy! But: what about all that SVN stuff? How and what do I back up? And: can I schedule these backups with backupninja? – Michael Niemand Jul 28 '09 at 11:11
  • Ok, I figured rdiff will be the way to back up SVN, but I don't really get this SSH stuff right yet; I will read into it. Thanks a lot! – Michael Niemand Jul 28 '09 at 12:04
  • Yes and yes. Backupninja has its own scheduler - check `/etc/backupninja.conf` (or such). I believe it has a plugin for SVN backups, but I might be mistaken. One of the subversion-*something* packages in Ubuntu includes a hot-backup script for Subversion; that enables you (and backupninja) to do backups without shutting down the SVN server. – Morten Siebuhr Jul 28 '09 at 14:37

It's hard to say what else you need to back up. It really depends on the situation. Is this Ubuntu machine only a database and SVN server, or does it provide other services as well?

You can create a MySQL dump using mysqldump. As a backup solution, I would recommend duplicity to you. As an easy-to-configure front-end you can use ftplicity. These tools will help you to back up all your data into (optionally signed and encrypted) tar archives and upload them to an FTP space or store them anywhere else.

To automate the backup process, you may have a look at cron.
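
To give a rough idea of what a single duplicity run can look like (the host name, paths and the choice of SCP as transport are only placeholders, not a recommendation for your setup):

# back up the directory holding the dumps to a remote machine over SSH
duplicity --no-encryption /var/backups/dumps scp://backupuser@backupserver/backup/wiki

A line like that, placed in a cron job together with the mysqldump step, covers the automation part of the question as well.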

Manuel Faux