0

I inherited a backup plan from the previous sysadmin that looks like this:

Back up 150 GB of data from each of two fileservers to two 3.5-inch external hard drives (USB 2.0) by mapping the data partitions to our quad-core Dell R200 server (Windows Server 2003) and running ntbackup (a full backup every Thursday night). Each backup takes 9 hours to complete. The external HDDs are taken home by the COO every Friday (our company is closed on weekends).

I've tried backing up directly from the two fileservers to the USB disks, but it takes more than 24 hours to complete since each of them has a slow processor.
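For reference, a quick back-of-the-envelope check of the current job's effective throughput (150 GB in 9 hours):

```shell
# Effective throughput of the current backup run, in MB/s.
awk 'BEGIN { printf "%.1f MB/s\n", 150 * 1024 / (9 * 3600) }'   # prints 4.7 MB/s
```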

I think this plan is ridiculous (is it?), but my actual question is:

Is the backup speed strongly affected by the speed of the USB port?

hsym

4 Answers

1

It is. USB 2.0 is 480 Mbps. Divide that by 8 to get 60 MB/s, but you'll never see that in real practice due to bus contention, driver overhead, etc.

SATA 1.0 is 1.5 Gbps (187.5 MB/s), and since it's a dedicated point-to-point link you get to use all that bandwidth. Heck, around 2000 is when PATA UDMA-66 came out, which, at 66 MB/s, matches USB 2.0.
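To put those rates in perspective, here are rough transfer-time estimates for the 150 GB in question; the ~30 MB/s "realistic USB 2.0" figure is an assumption based on commonly observed sustained rates, not a measurement of your hardware:

```shell
# Estimated time to move 150 GB at various sustained rates.
awk 'BEGIN {
  mb = 150 * 1024                                  # 150 GB in MB
  printf "USB 2.0 theoretical (60 MB/s):    %.1f h\n", mb / 60 / 3600
  printf "USB 2.0 realistic  (~30 MB/s):    %.1f h\n", mb / 30 / 3600
  printf "SATA 1.0 theoretical (187.5 MB/s): %.1f h\n", mb / 187.5 / 3600
}'
```

Even at a pessimistic USB rate, 150 GB should move in well under 9 hours, which suggests the bus is not the only bottleneck.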

Add to the mix that USB controllers need the CPU to do a lot of the work of transferring data, and yeah, things can be slow.

You should check out external SATA (e-SATA).

LawrenceC
  • Thanks for the speed comparisons. Our external HDDs don't have an e-SATA port, and neither does the server. I'm thinking the whole backup plan is just silly due to the slow speed of the backups; I think we need something different. And you've probably guessed it: the Dell R200 server (2 TB of data inside) doesn't have a backup plan at all. :( – hsym Nov 13 '10 at 19:36
  • The best I've seen as a sustained rate (read or write) to a drive over USB is ~30 MB/s – half the theoretical maximum. – David Spillett Nov 13 '10 at 20:18
1

The speed of the USB device will of course affect the duration of the backup. The theoretical limit of a USB 2.0 bus is 480 Mbps (60 MB/s), but experience tells me you won't achieve anything like this. If you are only copying file data, why not use robocopy and copy just the changes (i.e. not a full backup)? You could also use the Volume Shadow Copy feature of Windows to give you day-to-day restores of files (requires additional storage).
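A robocopy run along those lines might look like the sketch below (robocopy shipped in the Windows Server 2003 Resource Kit and is built into later Windows versions; the share name and drive letter here are hypothetical). By default robocopy skips files whose size and timestamp are unchanged, so repeat runs only move the deltas:

```bat
rem Mirror the fileserver share to the USB drive, copying only changed
rem files. Note /MIR also deletes files removed at the source, so this
rem is a mirror, not a versioned backup.
robocopy \\fileserver1\data E:\backup\data /MIR /R:2 /W:5 /NP /LOG:E:\backup\robocopy.log
```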

Simon Catlin
0

What content is being backed up? If it is files that lend themselves to incremental/differential backups, that would make each run much quicker (with an occasional full backup to be sure).

If you can change tools, you might want to look at rsync or similar tools that only copy updated content (with rsync, the --checksum option essentially forces a full update, which you should do occasionally to protect against corruption of old data in the backups). If you run rsync via its own protocol, rather than just backing up from an SMB share to a local drive, even this full scan will be a lot faster, as you are limited only by how fast the drives at either end can read and write (though if you are using a USB drive, that will still be the bottleneck when using --checksum or when a lot of content needs updating).

I've used rsync to keep large chunks of data backed up over relatively slow ADSL links for years now (so if you could arrange something like that, there would be no need for your COO to manually transport drives to/from home) and have found it reliable and efficient. There are similar tools, such as rdiff-backup, that operate in much the same way. One thing to note, though: even if you keep an on-line off-site backup like this, you should still keep an off-line backup too, like the external drives.
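The two modes described above might be sketched like this (assuming an rsync daemon exporting a "data" module on the fileserver; the host, module, and destination paths are hypothetical):

```shell
# Nightly incremental pull over rsync's own protocol: only files whose
# size or mtime changed are transferred.
rsync -av --delete fileserver1::data/ /mnt/backup/data/

# Occasional full verification pass: --checksum re-reads everything at
# both ends, catching silent corruption of older files in the backup.
rsync -av --delete --checksum fileserver1::data/ /mnt/backup/data/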

David Spillett
  • The content is the usual Office files and some Adobe files (PSD, AI, FLA), plus images and video/audio recordings from past projects. Yeah... most of the files in there are from past projects and are seldom accessed. – hsym Nov 14 '10 at 01:20
  • That content would definitely be best served by some form of incremental update, as very few of the files change. Where I work we have several tens of GB backed up remotely, but using rsync only tens of MB need to be transferred each night to update the off-site/on-line backups, making it practical to update them via a slow ADSL line. It also means we can keep many snapshots using a technique like http://www.mikerubel.org/computers/rsync_snapshots/ , so we have a year's worth of weekly backups and a month's worth of dailies taking about the same space two full backups would normally consume. – David Spillett Nov 14 '10 at 11:52
  • Even without changing tools, I'd be surprised if ntbackup didn't have reasonable differential/incremental backup options for you to investigate, which would greatly improve the efficiency of what you already do. Just make sure you do a full backup regularly too, otherwise restoring becomes a pain (as you will have to process the last full backup and all the incremental ones since, in order to get to the latest state). – David Spillett Nov 14 '10 at 11:56
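For the ntbackup route mentioned in the comments, an incremental job can be driven from the command line; a hedged sketch, with the source path, job name, and backup-file path being hypothetical:

```bat
rem Nightly incremental: backs up only files changed since the last
rem backup and clears the archive bit (/M selects the backup type).
ntbackup backup D:\data /J "Nightly incremental" /F "E:\nightly.bkf" /M incremental

rem Weekly full backup to reset the baseline.
ntbackup backup D:\data /J "Weekly full" /F "E:\weekly.bkf" /M normal
```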
0

I average about 45 GBytes/hour on a similar setup, so I don't think USB is the bottleneck.

Greg Askew