In general, I think making backups as automatic as possible ensures you always get the level of protection you need. You also want some redundancy both on-site and off-site: if the data doesn't exist in two places at once, it really doesn't exist.
I would recommend the following approaches for maximum redundancy:
Duplicate your important data locally
This can be done in a few ways:
- Mirror your drive(s) automatically with RAID-1
- Automatically clone your drive(s) with an rsync-style backup script or another application (Time Machine, Carbon Copy Cloner, etc.). I would prefer a tool that keeps rolling snapshots, like Apple's Time Machine or rsnapshot, so you can also get back older versions of files; see the sketch after this list.
- Store all your important data on a dedicated NAS with some sort of dynamically expandable RAID (like a Drobo or a Netgear ReadyNAS). Use an application (like those listed in method 2) to automatically copy data from your various systems and drives to the NAS at least daily.
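If you want to roll your own rsync-style snapshots, the core trick is rsync's `--link-dest` option: each run produces what looks like a full copy, but unchanged files are hard-linked against the previous snapshot, so only changed files take up extra space. Here is a minimal sketch of that idea; the source directory, the backup destination, and the `latest` symlink name are all assumptions you would adapt to your own setup:

```python
#!/usr/bin/env python3
"""Minimal rolling-snapshot backup sketch using rsync --link-dest.

Assumptions: rsync is installed, SOURCE is the directory to protect,
and DEST is a local or NAS-mounted backup volume. Each snapshot looks
like a full copy, but unchanged files are hard-linked to the previous
snapshot, so they consume almost no additional space.
"""
import datetime
import pathlib
import subprocess

SOURCE = pathlib.Path.home() / "Documents"   # example source directory
DEST = pathlib.Path("/mnt/nas/backups")      # example backup volume


def take_snapshot() -> None:
    DEST.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M%S")
    snapshot = DEST / stamp
    latest = DEST / "latest"                 # symlink to the newest snapshot

    cmd = ["rsync", "-a", "--delete"]
    if latest.exists():
        # Hard-link unchanged files against the previous snapshot.
        cmd.append(f"--link-dest={latest.resolve()}")
    cmd += [f"{SOURCE}/", str(snapshot)]

    subprocess.run(cmd, check=True)

    # Point the "latest" symlink at the snapshot we just made.
    if latest.is_symlink() or latest.exists():
        latest.unlink()
    latest.symlink_to(snapshot)


if __name__ == "__main__":
    take_snapshot()
```

Run it from cron (or launchd on a Mac) once a day and you get the same kind of rolling history that Time Machine or rsnapshot provide, minus their polish.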
Back up your important data offsite
I would recommend using an automatic 'cloud'-based backup service. Options include:
- Mozy
- Carbonite
- Backblaze
These services automatically upload unlimited data to their servers for around $5 a month. Some of them even keep 30 days of snapshots, so you can recover files you accidentally deleted or changed.
Mirror your system drive
If you really want to get paranoid about losing your data, I would also mirror your system drive daily using a cloning tool like Acronis True Image, Carbon Copy Cloner, or SuperDuper!. This gives you an exact copy of your system drive, so if the drive fails you can just pop in the cloned drive and pick up where you left off: no time wasted rebuilding your system, reinstalling applications, and so on. If you prefer to script this yourself, see the sketch below.
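A scripted clone is only a rough approximation of those tools, but the idea is the same: copy everything on the system volume to a spare drive on a schedule. This sketch assumes a Linux-style layout with the spare drive mounted at /mnt/clone; the mount point and exclude list are assumptions, it needs root privileges, and unlike the dedicated tools it does not handle bootloader setup for you:

```python
#!/usr/bin/env python3
"""Rough nightly system-drive clone sketch (Linux-style layout assumed).

The clone mount point and exclude list are examples only; a dedicated
cloning tool also handles bootloaders and platform quirks that a plain
rsync copy does not.
"""
import subprocess

CLONE_MOUNT = "/mnt/clone"   # assumed mount point of the spare drive

# Virtual and volatile filesystems that should not be copied.
EXCLUDES = ["/dev/*", "/proc/*", "/sys/*", "/tmp/*", "/run/*",
            "/mnt/*", "/media/*", "/lost+found"]


def clone_system_drive() -> None:
    # -a: archive, -A: ACLs, -X: extended attributes, -H: hard links
    cmd = ["rsync", "-aAXH", "--delete"]
    for pattern in EXCLUDES:
        cmd += ["--exclude", pattern]
    cmd += ["/", CLONE_MOUNT]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    clone_system_drive()
```

Schedule it nightly with cron; what the dedicated tools mostly add is scheduling, verification, and making the target drive bootable.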