Can I rely on S3 to keep my data secure?

4

I want to back up sensitive personal data to S3 via an rsync-style interface. I'm currently using s3cmd - a great tool - but it doesn't yet support encrypted syncs. This means that while my data is encrypted (via SSL) during transfer, it's stored on their end unencrypted.

I want to know if this is a big deal.

The S3 FAQ says "Amazon S3 uses proven cryptographic methods to authenticate users... If you would like extra security, there is no restriction on encrypting your data before storing it in Amazon S3."

Why would I like extra security? Is there some way my buckets could be opened to prying eyes without my knowing? Or are they just trying to save you when you accidentally change your ACLs and make your buckets world-readable?

Jamie Hale

Posted 2010-02-24T21:06:21.167

Reputation: 143

Answers

6

With unencrypted data you can never be 100% sure that some rogue sysadmin at Amazon isn't peeking at it. And you can be sure that Amazon will hand over your data if the authorities ask for it.

If you want to be 100% secure, encrypt locally before uploading.
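For example, a minimal sketch of that approach using GnuPG plus s3cmd (the file and bucket names here are just placeholders):

gpg --symmetric --cipher-algo AES256 --output backup.tar.gpg backup.tar
s3cmd put backup.tar.gpg s3://my-bucket/backup.tar.gpg

Only the ciphertext ever leaves your machine, so Amazon only stores a blob it cannot read.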

Nifle

Posted 2010-02-24T21:06:21.167

Reputation: 31 337

That's what I figure. There just doesn't seem to be a nice way to do it from my Linux server. – Jamie Hale – 2010-02-25T00:51:11.123

These days, s3cmd has built-in support for local GnuPG encryption before upload and after download. – a CVn – 2012-12-22T15:28:52.743
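A rough sketch of that, assuming the GPG options have been set up via s3cmd --configure (the paths and bucket name are placeholders):

s3cmd put --encrypt /home/me/secret.txt s3://my-bucket/secret.txt
s3cmd get s3://my-bucket/secret.txt restored-secret.txt

s3cmd should then decrypt on download using the same configured passphrase. – a CVn – 2012-12-22T15:28:52.743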

1

You can encrypt your data before uploading to S3.

Here is a document about AWS security:

http://s3.amazonaws.com/aws_blog/AWS_Security_Whitepaper_2008_09.pdf


Jungle Disk has a good solution for storing data in the cloud with encryption: http://jungledisk.com/

MicTech

Posted 2010-02-24T21:06:21.167

Reputation: 9 888

Thanks. I read that paper earlier and it certainly sounds like they try to keep it safe. I used Jungledisk until they went with the service license - then I moved on. – Jamie Hale – 2010-02-25T00:49:59.673

Also, at the moment I have 3G in 3000 files. Encrypting locally will work if it can be automated smartly. I just haven't seen a good solution yet. See the s3cmd pages for some great discussion on the problems involved. – Jamie Hale – 2010-02-25T00:52:43.060

1

Look into Super Flexible File Synchronizer. It's a nifty file-sync and backup app that works very well with S3.

It will compress (using the zip standard) and encrypt (AES-256) on the fly before uploading to a bucket.

Uploading to any cloud service changes a file's modified date. Superflexible "saves" the date-modified information as part of the file name, so that when you restore the file the original date is restored as well.

It tackles versioning and partial (block) updates of large files. Great for SQL databases and Outlook .pst files (untested).

It also breaks large files down into manageable chunks.

The restore wizard hides most of the redundant information (chunks and date info in the file name).

It is not an S3 file browser, though.

It is powerful, so it is not for the faint-hearted. The UI is good, and their implementation of profiles is one of the best I have seen.

Best of all, you don't need Superflexible to decrypt your files, say, in a few years: WinRAR (tested) and WinZip or similar (not tested) can decrypt them independently.
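For instance, assuming the uploaded archives really are standard AES-encrypted zips, a stock command-line tool such as 7-Zip should open them without Superflexible installed (the archive name and password are placeholders):

7z x -pMySecret backup-chunk.zip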

The program also handles backups across LANs, USB drives, FTP sites, etc. It is extremely fast at comparing remote and local files. I have been using it for about four years for local backups, and recently for S3.

bold

Posted 2010-02-24T21:06:21.167

Reputation: 11

1

Use duplicity. It offers bandwidth-efficient backups to a variety of backends, including Amazon S3, and can encrypt the backup using GPG. duplicity is also a snap to use:

duplicity --encrypt-key 'your gpg key' myFileOrFolder s3+http://myBucket/where/I/want/it/to/live

Just make sure you don't lose the GPG keys, or your data will be sealed away forever.
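Restores run in the other direction: point duplicity at the bucket URL and a local target directory, and (assuming the matching key is in your GPG keyring) it fetches and decrypts the backup:

duplicity s3+http://myBucket/where/I/want/it/to/live myRestoredFolder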

Sharpie

Posted 2010-02-24T21:06:21.167

Reputation: 113