
A website I've been helping out with is hosted on a web host (not my choice) that only provides FTP access to the files (no shell access), and the database can only be accessed from the host itself or through phpMyAdmin.

The website is running on Apache with PHP version 5.3.13.

Since bad and unforeseen things happen to websites from time to time, I'd like to have a backup of the site. I already have most of the infrastructure locally (database schema and the website's source files), but if the website got hacked or something else bad happened, I would still be left with a blank site. So I need to back up all user-uploaded media and all of the data in the database.

The problem is that I'm really unsure how to do this well when all I have is FTP access and no direct access to the database. Any advice on how to get a decent automatic backup of this site that can run periodically without too much trouble?

Svish

5 Answers


You can get this to work with a little bit of hacking around (if the backtick operator below doesn't work, try exec(), system(), passthru(), anything you can). You can also get these restrictions disabled if you work at it a little, but I don't think this is the place to discuss that.

The script would be as follows:

<?php
// The backticks are PHP's shell-execution operator (equivalent to shell_exec()).
`mysqldump -uUSERNAME -pPASSWORD DATABASE > dump.sql`;
// Archive the whole web root, including the fresh dump.sql.
`tar -cf backup.tar .`;
// ... download backup.tar, then clean up ...
unlink('dump.sql');
unlink('backup.tar');
?>

Of course, this is basic, but it should give you the idea. To secure it, place backup.tar in a backup/ directory that is password protected via .htaccess, so that only you can download it.
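For the .htaccess protection, something like the following should work; the AuthUserFile path is a placeholder you'd point at your own .htpasswd file (which you can generate with Apache's htpasswd tool, or an online generator since you have no shell):

# backup/.htaccess: require a password for everything in this directory
AuthType Basic
AuthName "Backups"
AuthUserFile /full/path/to/.htpasswd
Require valid-user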

If you have this level of "shell access", the possibilities are endless.

Jay

If you don't have access to a shell, you can use a cron service (see http://www.cronjobservices.com/, or your host may offer one) in conjunction with MySQLDumper (http://www.mysqldumper.net/).

If you do have access to a shell (it doesn't sound like it from your description), I'd recommend mysqldump and cron jobs.

For the file content, you could create a PHP script that zips everything up (see the sketch below), but you'd still need to download the archive yourself.
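A minimal sketch of such a script, assuming PHP's ZipArchive class is available and that the user uploads live in an uploads/ directory (both are assumptions; adjust the paths to your own layout):

<?php
// Hypothetical paths - adjust to the real site layout.
$source  = __DIR__ . '/uploads';
$zipFile = __DIR__ . '/backup/uploads.zip';

$zip = new ZipArchive();
if ($zip->open($zipFile, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create zip archive');
}

// Recursively add every file under the uploads directory.
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    // Store paths relative to the uploads directory inside the archive.
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($source) + 1));
}
$zip->close();
echo 'Wrote ' . $zipFile;
?>

The resulting archive could then be fetched over FTP, or over HTTP from a password-protected directory as suggested in the first answer, with the script itself triggered by one of the cron services mentioned above.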

Anin
Martin Müller

phpMyAdmin is capable of exporting a database, though size and timeout constraints may make this impractical.

If you have access to the MySQL data files through FTP, then stopping the database and copying the files off would probably be your best bet.

Hyppy

Given your setup, automatic backups are going to be difficult. Does the web host offer any backup options, or the ability to run cron/scheduled jobs? You probably already know this, but phpMyAdmin can produce a database dump, and with FTP you can grab your files, but neither of those is automatic.

In theory, you could programmatically FTP to your web host account and download the files, and script the download of a MySQL dump (a rough sketch follows), but that is an inefficient use of time and resources and isn't a proper backup, IMO.
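If you did want to script it anyway, a rough sketch using PHP's built-in FTP functions (run from your local machine) might look like this; the hostname, credentials, and directory names are placeholders, and it only fetches a single remote directory rather than the whole tree:

<?php
// Placeholder connection details - not from the original question.
$conn = ftp_connect('ftp.example.com');
if (!$conn || !ftp_login($conn, 'USERNAME', 'PASSWORD')) {
    die('FTP connection/login failed');
}
ftp_pasv($conn, true); // passive mode is usually friendlier to firewalls/NAT

// Download every file in one remote directory to a local backup folder.
$remoteDir = '/public_html/uploads';
$localDir  = __DIR__ . '/backup/uploads';
@mkdir($localDir, 0777, true);

$entries = ftp_nlist($conn, $remoteDir) ?: array();
foreach ($entries as $remoteFile) {
    $local = $localDir . '/' . basename($remoteFile);
    ftp_get($conn, $local, $remoteFile, FTP_BINARY);
}
ftp_close($conn);
?>

Combined with a database dump fetched the same way (or exported from phpMyAdmin) and run from cron on your own machine, that gets you something automatic, even if it is clunky.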

Another option you might consider is mirroring your site locally, or to another location of your choice, using wget, curl, etc. This will give you a static copy of your site's files and help you restore it if you need to. It can be set up via cron on your local system, but of course the data needed to restore the database is not included with this option.

To be able to restore to a point in time, you're looking at regularly taken backups/snapshots -- something the host provider would have to give you.

A few parting thoughts:

  • I would be concerned if my web host only provided FTP access, since FTP transmits credentials in clear text.
  • If you expect bad things to happen, make a list of them and see how you can address each one, e.g. SFTP instead of FTP, strong passwords, limited access to components, etc.
  • Time to look for a new web host? (_:

HTH and Good luck!

KM.
  • A different web host would indeed be nice! Just curious if there are any clever solutions to a restrictive environment like this one :) – Svish Jun 12 '12 at 16:04

You can manually dump the database in phpMyAdmin to a file on the server, which avoids HTTP timeout issues with the export. I would also triple-check the web host's control panel for an option to dump the database to the server on an automated schedule; obviously, that would be much better. Either way, once the database has been dumped to the server, you can back the dump up along with the rest of the application's filesystem data.

There are some programs that let you do essentially an rsync-style incremental backup over FTP; otherwise, you're going to be stuck doing a full backup each time. Please see this ServerFault question. My experience with curlftpfs, mentioned near the bottom, is that the timeouts/slowness can make backing up mid-sized files (e.g. over 20 MB) difficult: it has trouble reading the end of the file. But YMMV. Good luck!

Nada