23

It's interesting to see the technological split between structured corporate environments and more developer-driven/startup environments. Some of the Microsoft technologies I take for granted (VSS, Folder Redirection, etc.) simply are not available when managing the increasing number of Apple laptops I see in DevOps shops.

I'm interested in centralized and automated backup strategies for a group of 30-40 Apple laptops...

How is this typically done safely and securely, assuming these are company-owned machines (versus BYOD)?

  • While Apple has Time Machine, it's geared toward individual computer backups and doesn't seem to work reliably in a group setting. Another issue with these workstations is the presence of Vagrant/VirtualBox VMs on the developers' systems. Time Machine and virtual machines typically don't play well together unless the VMs are excluded from the backup set.
  • I'd like a push-based backup process with some flexible scheduling options.
  • I know how to handle the backend storage, but I'm not sure on what needs to be presented to the client systems.
  • Due to the nature of the data here, cloud-based backup may not be a viable option.

Any suggestions about how you handle this in your environment would be appreciated.

Edit: The virtual machine backups are no longer important. They can be excluded from the process and planning.
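For what it's worth, dropping the VMs from each machine's Time Machine set is scriptable with tmutil; a sketch like the following would work (the path is a guess at a typical Vagrant/VirtualBox layout -- substitute the real one):

```shell
# Exclude the VirtualBox VM directory from Time Machine backups (fixed-path exclusion)
sudo tmutil addexclusion -p "/Users/dev/VirtualBox VMs"

# Verify the exclusion took effect
tmutil isexcluded "/Users/dev/VirtualBox VMs"
```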

ewwhite
  • 194,921
  • 91
  • 434
  • 799

7 Answers

9

We're just trying to bring our Macs into the fold here. My original plan was to use Backup Exec's Mac agent. Then I found out that the agent doesn't support 10.9, or even 10.8. So if you're keeping the OS up-to-date, that's out. I've heard legend tell that the next SP will get it up to speed, but I'm not holding my breath.

It has been a few years, but Retrospect used to be the gold (and only) standard for Mac backup. Install the agent and you could set a schedule so the Macs would back up once connected to the network. I don't have recent experience with it, though it did work via VPN many moons ago. You'd then want to have it save the backup sets to storage that you would sweep into your existing backup environment.

If you get a Mac Mini with OS X Server, you can redirect Time Machine on the laptops to the network, then sweep those backup sets up with another disk backup tool. I don't know if there's any granularity to Time Machine, though -- I believe it grabs the entire disk, or nothing.
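Pointing each laptop's Time Machine at the server share can itself be scripted with tmutil on recent OS X releases; a rough sketch (server name, share, and credentials are all placeholders):

```shell
# Aim Time Machine at a network share hosted on the OS X Server box
sudo tmutil setdestination "afp://backupuser:password@macmini.example.com/TimeMachine"

# Turn automatic backups on
sudo tmutil enable
```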

I know you mentioned cloud may not be an option, but if that is because of the VMs (which are now out of scope?), then perhaps that makes your CrashPlan/BackBlaze/Carbonite options more palatable.

If you do want to bring the VMs in scope, you could install a Windows-based agent in the VM, and treat that as you would anything else.

CC.
  • 1,186
  • 1
  • 10
  • 22
  • 4
    +1 for CrashPlan ProE. It works very well, and the block-level dedup on the client end allows for easy local and remote backups to a private backup server. We use a ZFS + NFS backend on a Linux server with a CrashPlan ProE server front end – tegbains Feb 15 '14 at 08:22
  • @tegbains How is the cost? Which ZFS implementation are you using for the storage? – ewwhite Feb 15 '14 at 17:16
  • 1
    @ewwhite Cost is the only real issue with CrashPlan ProE (CPPE). We budget the cost to be around $60 per user per year on low volumes. You pay per user who can have up to 4 machines included in that license. There is no cost for the server portion. That allows us to use a big master server and several smaller NAS type boxes for distributed off-site backups. For ZFS we have been using OpenSolaris/Nexenta. We will probably switch over to FreeBSD/NAS 10 for the next project due to the strangeness in Nexenta's release cycle. – tegbains Feb 15 '14 at 20:56
  • Don't get me started on Nexenta's fall from grace... – ewwhite Feb 15 '14 at 21:11
  • @ewwhite I hear you about Nexenta... I am also looking at OmniOS – tegbains Feb 15 '14 at 21:13
  • Agree about Nexenta (and anything that promises large ZFS replication, which fails gracelessly). nas4free is the way to go if you want to roll your own NAS IMHO. – quadruplebucky Feb 16 '14 at 05:45
  • 1
    Time Machine allows the user to exclude portions of the disk from backups. – Dan Pritts Feb 17 '14 at 17:22
7

Acronis supports Macs and a centralized backup server. Symantec also supports Macs and has a centralized appliance. There's also Retrospect, a long-time established Mac backup package that also appears to support a local backup server. I'm sure there are more. (I've intentionally excluded cloud services.)

Of course, the way we're using Acronis (for Windows!) qualifies more as business continuity rather than disaster recovery. We're using it for the users who have SSDs; when the SSDs inevitably die, Acronis gets them back up and working fast. The actual DR data is all server data and is handled differently based on whether it's client data or internal data.

You didn't explicitly state whether you were looking for business continuity answers or disaster recovery answers, but I've answered more along the lines of continuity. On the other hand, if the building burns down, perhaps your devs will have their laptops with them, so continuity is probably more of what you need.

[Edit]

I had intentionally excluded Crashplan due to the "no cloud" restriction, despite liking the home version a lot. Crashplan and Acronis are different use cases, though; Acronis does actual imaging, and Crashplan is data only (by default, the user's home directory only). Acronis is scheduled, and Crashplan is continuous (whenever the storage is available).

In our particular environment, developers are allowed to customize their machines in whatever way is most efficient for them, so they need an image level backup so they can get back up and running fast in case of emergency. If your devs use their machines the same way, they probably need an image-level backup, too. One more thing to look at in the product offerings, alas. (It looks like Acronis' Mac imaging is providing a central repository for Time Machine, but I could be misreading.)

(I've heard of home users telling Crashplan to back up their entire hard drive, including the Windows directory, but they're doing it wrong, alas, because restores would probably be wading into unsupported territory. It's all about backing up data.)

Katherine Villyard
  • 18,510
  • 4
  • 36
  • 59
  • 2
    Well, the confusion centers around the fact that Apple's Time Machine works incredibly well for standalone systems, but attaining the same level of utility for a *group* of systems seems far more difficult. I'll look into the commercial offerings. – ewwhite Feb 13 '14 at 19:05
  • Yeah. It looks like people other than you are having this issue, and Apple support says, basically, "Use Time Machine to back up to a NAS" in one link I found, but... – Katherine Villyard Feb 13 '14 at 19:14
  • 2
    One heads-up on the "Symantec supports Macs" idea from recent research...they're usually quite a bit behind on OS support. NetBackup will do 10.8, but not 10.9 yet. Backup Exec is still stuck on 10.7. – CC. Feb 13 '14 at 20:58
  • 1
    Avoid Retrospect. It was great at some point; we've been using it since version 2.0, when Dantz owned it. It's out of date and not as reliable as other options. – tegbains Feb 15 '14 at 08:23
  • So Symantec and Retrospect are out... – ewwhite Feb 15 '14 at 15:19
  • 2
    @KatherineVillyard We are evaluating [Crashplan's PROe offering](http://www.crashplan.com/enterprise/overview.html), which would allow us to use an internal backup server running the OS of our choice. – ewwhite Feb 21 '14 at 13:51
7

I used to use CrashPlan at a previous job to back up a couple of hundred Mac laptops, a few Windows VMs, and even a couple of Linux servers.

They have a cloud based solution, but we used the on-premise server (I think they've since renamed it to CrashPlan ProE) and it was rock solid.

I liked it enough that I use their cloud consumer solution to back up all my personal Macs.

re: Mac filesystem attributes mentioned in another answer - OS X is fully supported on CrashPlan and we never had any issues restoring Mac resource forks. You can run the server on OS X, but we ran ours on a Dell running Ubuntu.

re: Pricing - the seats are per-computer, not per-user, so if a user has a laptop and a desktop, that counts as two seats which seems reasonable. The seat price was on the low end of the range of different products we looked at.

CP has the typical enterprise features for configuring how long to keep backups (we kept hourly changes for a couple of weeks, dailies for a month, then weeklies for six months and monthlies after that), and you can set up different organizations that have different settings. Setting up our server to authenticate against our LDAP took about 5 minutes; I recall being shocked at how quickly we got everything set up.

Joe Block
  • 696
  • 4
  • 12
  • How was cost for the on-premises solution? Did you ever have to recover data from backups? – ewwhite Feb 15 '14 at 17:21
  • 1
    I should note that they don't do bare metal restores, just your user data, but for us that was enough. When we had to replace a laptop we'd prep one from a golden image, configure CrashPlan with the user's credentials, then plug it into one of the dedicated gig ports on the same switch as the CrashPlan server and let it slurp down their files. – Joe Block Feb 15 '14 at 17:22
  • Cost was pretty decent. The more seats you buy at a time, the lower the per-seat was. I don't recall the exact pricing but I do remember it being toward the low end of the range. – Joe Block Feb 15 '14 at 17:25
  • We had to do a fair amount of restores (upgrading hardware and replacing stolen/broken laptops mostly) and that was painless once we set up a couple of dedicated gig ports so it could max out the download speed. – Joe Block Feb 15 '14 at 17:30
3

I use Backblaze for many of my clients and on all of my machines (well, all the Windows and OS X ones anyway - no support for anything else) - I can recommend them highly. The downsides are that the initial backup can take a while and it can be cumbersome to do a complete restore (they will overnight a drive for something like $200, but it can take time to prepare it), but it's completely automatic and very lightweight. It works well on Macs and Windows machines. (I also use Acronis locally for a Windows machine that I like to abuse; I've never used their Mac products.) Backblaze also supports versioning and local encryption (i.e. they don't have your keys), and works from any internet connection, which is great for laptops.

CrashPlan is more expensive for business versions but they do have the advantage that you can seed your initial backup by sending them a drive.

I have never had a positive experience with Backup Exec (or Symantec anything at all), or Time Machine with anything more than a few machines.

quadruplebucky
  • 5,041
  • 18
  • 23
2

If I were you, I'd use network home folders over NFS or AFP and have a standardized image built from something like DeployStudio or Apple's built-in deployment solution.

When a laptop fails, all of the data and user state is safe on your server (which is being backed up by something more enterprisey than Time Capsule, hopefully) and you can lay down a fresh image on fresh hardware and not think about it. Of course, this has some prerequisites that many smaller dev shops scoff at, such as Open Directory or Active Directory (unless you want to configure it all by hand).

MDMarra
  • 100,183
  • 32
  • 195
  • 326
0

If you want to try running Time Machine against a file server, you can run netatalk on a generic *nix box to get the required AFP protocol support.
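With netatalk 3.x, a minimal afp.conf advertising a Time Machine-capable share looks something like this (the share name, path, and size cap below are assumptions -- adjust to taste):

```ini
; /usr/local/etc/afp.conf -- netatalk 3.x sketch
[Global]
; make the share show up like an Apple backup appliance
mimic model = TimeCapsule6,106

[Time Machine]
path = /srv/backups/timemachine
time machine = yes
; cap each client's sparsebundle growth (value in MiB)
vol size limit = 512000
```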

--

A quick tip to make VM backups less painful, regardless of your backup strategy:

Make regular snapshots of the VMs and work from the snapshots instead of the originals. This way the original disk files won't be changed.

Alternately/additionally, make the VMs dataless and revert them to the snapshot state after each run. Store files that will be changing on a file server. VMware has a bundled Samba you can use for sharing folders from the host; if VirtualBox doesn't, you can install your own Samba if needed.

You can script this stuff up to make it quick and easy to start and stop your VMs. VMware, again, has command-line options to the vmrun program (it's at the core of the app; look around with ps and you'll see it). You can do stuff like:

vmrun stop "/Users/foobar/Documents/VMs/win7.vmwarevm/win7.vmx" hard

which will kill the running VM, and revert to the snapshot.

Poke around and I bet you'll find similar commands for VirtualBox.
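For VirtualBox specifically, the VBoxManage equivalents would be along these lines (the VM and snapshot names are placeholders):

```shell
# Take a known-good snapshot once, while the VM is in a clean state
VBoxManage snapshot "win7-dev" take "clean-base"

# Kill the running VM and roll back to the snapshot
VBoxManage controlvm "win7-dev" poweroff
VBoxManage snapshot "win7-dev" restore "clean-base"

# Start it fresh again, without a GUI window
VBoxManage startvm "win7-dev" --type headless
```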

--

One other thing you might try is BackupPC. It uses rsync or tar over ssh as a transport and does file-level deduplication on the back end. I've used it for years with Linux clients.

The only trick with Macs is that you need to be sure you're getting whatever Mac-filesystem-specific metadata you need -- resource forks, etc. People on the mailing list have reported success with "Xtar", a tar extended for OS X. In your case you probably don't have any of these, but make sure.

Dan Pritts
  • 3,181
  • 25
  • 27
  • 4
    I've done this, and it's notoriously unreliable. About once a week, a full backup would need to be re-created and the previous backup set is broken. This is a common complaint amongst people that roll their own time capsule storage. Network targets for time machine backups that aren't an OS X server or a Time Capsule are sketchy at best. – MDMarra Feb 13 '14 at 20:04
  • Ouch. I ran into this a couple time and chalked it up to my flakey old linux box. – Dan Pritts Feb 14 '14 at 20:28
  • I ran into trouble with this too. It is part of why I ended up going with CrashPlan (I also wanted off-site backups for my personal files). – Joe Block Feb 15 '14 at 18:11
0

I took an unconventional approach: I set up Git to push to a private remote server, driven by a script and a cron job.

It obviously doesn't handle ACLs, but the "repair permissions" command in Disk Utility works fine for this.
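A minimal sketch of what that script could look like -- the directory and remote are assumptions; point them at a developer's home directory and a private server-side bare repository:

```shell
#!/bin/sh
# Cron-driven Git backup sketch: commit local changes and push to a
# private remote. backup_home <dir> <remote> -- both args are placeholders.
backup_home() (
    dir=$1 remote=$2
    cd "$dir" || exit 1
    # Initialize the repository and remote on the first run only
    [ -d .git ] || { git init -q && git remote add origin "$remote"; }
    git add -A
    # Commit only when something actually changed since the last run
    git diff --cached --quiet ||
        git -c user.name=backup -c user.email=backup@localhost \
            commit -q -m "backup $(date '+%Y-%m-%d %H:%M')"
    # Push the current branch to the remote's master branch
    git push -q origin HEAD:master
)
```

Then a crontab entry like `0 * * * * /usr/local/bin/backup-home.sh` runs it hourly.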

Twitch
  • 101
  • 1
  • 9
  • What are you including in your Git repository? The entire home directory tree? – ewwhite Feb 20 '14 at 21:21
  • Yes, the home directory and any other specific directories. I'm not particularly worried about backing up the entire system so much as preserving specific work. And Git is something a lot of developers are already familiar with. – Twitch Feb 20 '14 at 21:25