This isn't a cheap solution, but if your buckets really are critical, here's how you do it: boot an Amazon EC2 instance and sync the content there periodically.
Amazon EC2 is Amazon's virtual server hosting service. You can spin up Linux or Windows instances and run anything you want on them. You pay by the hour, and each instance comes with a fairly large amount of local disk. For example, I use the "large" instance size, which includes 850GB of local disk space.
The cool part is that EC2 sits on the same network as S3, and data transfers between the two are free. I use the $20 Jungle Disk software on a Windows EC2 instance, which exposes my S3 buckets as if they were local disk folders. From there, scheduled batch files copy data out of S3 onto the local EC2 disk. You can automate it to keep hourly backups, or, if you want to gamble, set up Jungle Disk (or one of its Linux equivalents) to sync once an hour or so - then if someone deletes a file, you have until the next sync runs to pull it back from EC2. I'd recommend the regular scripted backups, though: it's easy to keep a few days' worth if you're compressing them onto an 850GB volume.
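To make the "scheduled copy plus a few days of compressed backups" idea concrete, here's a minimal sketch in Python you could run hourly from Task Scheduler (or cron on Linux). The paths are hypothetical - it assumes Jungle Disk has mounted the bucket as a local folder - and the retention count is just an example:

```python
import shutil
import time
from pathlib import Path

# Hypothetical paths: J:/my-bucket is where Jungle Disk exposes the S3
# bucket, D:/backups is the EC2 instance's local disk. Adjust to taste.
SOURCE = Path("J:/my-bucket")
BACKUP_DIR = Path("D:/backups")
KEEP = 72  # keep the last 72 hourly archives, roughly three days

def take_backup(source: Path, backup_dir: Path, keep: int) -> Path:
    """Compress `source` into a timestamped zip and prune old archives."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    # shutil.make_archive appends ".zip" and returns the archive path.
    archive = shutil.make_archive(
        str(backup_dir / f"s3-backup-{stamp}"), "zip", root_dir=source
    )
    # Timestamped names sort chronologically; delete all but the newest `keep`.
    archives = sorted(backup_dir.glob("s3-backup-*.zip"))
    for old in archives[:-keep]:
        old.unlink()
    return Path(archive)

if __name__ == "__main__":
    take_backup(SOURCE, BACKUP_DIR, KEEP)
```

Because each run produces an independent archive, restoring after an accidental delete is just a matter of unzipping the newest archive that still contains the file.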
This is really useful for SQL Server log shipping, but I can see how it'd accomplish your objective too.