
Currently we have about 20 OS X 10.9 MacBook Pros (almost all with SSDs) backing up to individual USB drives. I'd like to consolidate these onto a single Drobo Thunderbolt drive array attached to a Mac Mini (running OS X Server 10.9) using the Time Machine Server service.

My question is: will this scale to 20 users? The examples I have seen top out at 5 or 6 users, and this isn't easy for me to test (I'd rather not ask everyone to back up to the array and then switch back to USB drives if it brings our network to its knees). My primary concern is saturating our gigabit network: Time Machine backs up every machine hourly, so a couple of people would usually be backing up at any given time. Some people are occasionally on our 802.11ac Wi-Fi network rather than Ethernet (usually connecting at 802.11n rates until they upgrade to newer machines), but most of the time people are connected to our Thunderbolt Displays, which have a gigabit Ethernet port.

Our network topology is one 32-port gigabit switch with five smaller gigabit switches, one at each desk cluster. The Mac Mini server is connected directly to the top-level switch.

Update: Absent information from someone who has done this in practice, I suppose my question is really about how switches work. If three or four people are backing up simultaneously, and two other users then transfer a file between each other, will they be able to transfer the file at gigabit speeds?

user197609
  • Do you have access to the administrator of a 5-6 user example? You could ask him / her to fire up Activity Monitor on the server and see how much network traffic there is. I think it will work fine as long as you seed initial backups one or two at a time to avoid the storm... scariest part of your plan is the word "Drobo." – Skyhawk Nov 07 '13 at 16:54
  • I don't have access to an administrator. Why is Drobo scary? – user197609 Nov 08 '13 at 21:37
  • The worst feature of a Drobo is unexplained total data loss. Read some 1-star product reviews, note the percentage of reviews falling into that category, and decide for yourself. In terms of LAN bandwidth your limiting factor is the gigabit link from the main switch to the Mini... so you can move data at 1Gbps altogether. However, keep in mind that even gigabit ethernet is in practice about 4x as fast as USB 2.0, which is perfectly adequate for Time Machine backups, so, in theory, what's the harm in a little sharing? – Skyhawk Nov 09 '13 at 00:52

1 Answer


It depends on the clients' file usage, but in general, with a gigabit network you shouldn't have any trouble. Time Machine does incremental backups, so it only moves the data for new or changed files on each run. It also uses FSEvents to track file changes on the client side, so it doesn't (usually) have to scan the entire disk to work out what's different.
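If you want to see how much data an hourly increment actually moves, `tmutil` can diff a client against its most recent snapshot. Run this on a client that already has at least one completed backup; the totals at the end of the output are roughly what the next incremental backup will transfer:

```shell
# Print the path of the newest completed snapshot for this machine
tmutil latestbackup

# Diff the current state of the disk against the latest snapshot;
# the summary at the end reports the total size of added and
# changed data since that snapshot
sudo tmutil compare
```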

However, it's a file-level incremental backup, so if any clients have large files that change often (databases, VM disk images, video render scratch files, etc.), those clients will eat both bandwidth and server capacity copying the entire file every hour.
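As a rough sanity check on the bandwidth concern, here's a back-of-envelope sketch. The 200 MB/hour churn per client and the 70% usable-link figure are illustrative assumptions, not measurements; substitute numbers from your own environment (e.g. from `tmutil compare`):

```python
# Back-of-envelope estimate of Time Machine load on the server's
# gigabit uplink. All inputs are illustrative assumptions.

GIGABIT_MBPS = 1000 * 0.7     # ~700 Mbit/s usable after protocol overhead
CLIENTS = 20
CHURN_MB_PER_HOUR = 200       # assumed average changed data per client
BACKUP_WINDOW_S = 3600        # hourly backups are spread across each hour

# Total data the server must absorb per hour, in megabits
total_mbit_per_hour = CLIENTS * CHURN_MB_PER_HOUR * 8

# Average utilization of the server's single gigabit link
avg_utilization = total_mbit_per_hour / BACKUP_WINDOW_S / GIGABIT_MBPS

print(f"average uplink utilization: {avg_utilization:.1%}")  # → 1.3%
```

Under these assumptions the steady-state load is trivial; the only real crunch is the initial full backups, which is why staggering them (below) matters.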

A few recommendations, though:

  • In many cases it makes sense to exclude the /Applications, /Library, and /System folders from the backup (excluding /System also gives you the option to exclude all system files and applications -- basically, the hidden Unix portions of the OS). This saves space on the backup server. If a client dies you won't be able to do a bare-metal restore, but you can reinstall the OS (or reimage the machine), use Migration Assistant to recover the entire user account from the backup, and then reinstall apps and other third-party software.
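On each client, these exclusions can be scripted with `tmutil` rather than clicked through System Preferences; the folder choices below just mirror the suggestion above:

```shell
# Exclude system folders from Time Machine backups.
# -p makes each exclusion fixed-path rather than sticky to the item.
sudo tmutil addexclusion -p /Applications
sudo tmutil addexclusion -p /Library
sudo tmutil addexclusion -p /System

# Confirm an exclusion took effect
tmutil isexcluded /Applications
```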

  • The initial snapshot is effectively a full backup, so I wouldn't enable it on all clients at once; stagger them. Also, backing up over Wi-Fi is generally fine afterwards, but do each initial snapshot over gigabit Ethernet.

  • Time Machine now supports multiple destinations, so you can leave the USB backups in place and simply add the server as a second target. It seems to be fairly smart about rotating between destinations when several are available, and it just uses whichever one is reachable when they're not. I'm a great believer in backup diversity, and this gives you at least a bit of it.
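Adding the server as a second destination without dropping the USB drive is one `tmutil` call per client; the volume name below is a placeholder for whatever your Time Machine Server share mounts as:

```shell
# -a appends a destination instead of replacing the current one.
# Mount the server's backup share first; "TimeMachineShare" is a
# placeholder name for your setup.
sudo tmutil setdestination -a /Volumes/TimeMachineShare

# Verify that both destinations are now registered
tmutil destinationinfo
```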

  • If it does turn out to be a problem (and I'm fairly sure it won't), there are (unsupported but pretty safe) ways to adjust the backup schedule. See TimeMachineEditor or TimeMachineScheduler.
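For reference, on 10.9 the knob those tools turn is (as far as I know) the launchd interval for the automatic backup job. Editing it directly is unsupported, so keep a copy of the plist before changing anything:

```shell
# Back up the launchd job definition before touching it
sudo cp /System/Library/LaunchDaemons/com.apple.backupd-auto.plist \
        ~/backupd-auto.plist.bak

# Change the backup interval from 3600 seconds (hourly) to 7200
sudo defaults write /System/Library/LaunchDaemons/com.apple.backupd-auto \
     StartInterval -int 7200

# Reload the job so the new interval takes effect
sudo launchctl unload /System/Library/LaunchDaemons/com.apple.backupd-auto.plist
sudo launchctl load /System/Library/LaunchDaemons/com.apple.backupd-auto.plist
```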

Gordon Davisson
  • Thanks. What would you estimate is the largest practical limit for number of typical users doing hourly backups? We do software development, so there are no large video files for example being backed up. – user197609 Nov 08 '13 at 22:50
  • I don't have any experience with large installations, but as long as the hourly changes are pretty small, I'd expect it to scale to hundreds of clients without too much trouble. Don't let the "hourly" aspect scare you: it's doing about the same amount of total work as a daily snapshot, but spread out into a bunch of small updates so it's actually less disruptive than a daily backup would be. Unless you're doing full backups and (for instance) everyone installs the 10.9.1 update at the same time... – Gordon Davisson Nov 09 '13 at 04:15