
Background

So I have a 1 TB Google Drive account, and I use Drive as a backup for all my precious files.

I used to have a local Drive folder with auto sync on my PC. It was very convenient to sync folders and files transparently.

Then, when ransomware awareness kicked in, I got a little paranoid that an infection could encrypt my local files, sync the damage, and render my Drive backup useless.

I really want to set up my local Drive folder again, but the paranoia still exists.


How to resolve this?

Approach 1

Disable auto-sync and sync only when necessary. This approach kind of sucks. It doesn't completely resolve the issue: infected files can still be synced during the window when sync is enabled. In addition, manual sync is not ideal.


Approach 2

As suggested in this post: How Ransomware Locks Files on Google Drive, use Backupify for G Suite.

Cons:

  1. Not free.
  2. I still don't understand how this resolves the problem. How many copies of your data do they save? What if I only become aware of the ransomware weeks or months later? Do I still have my copy?

Question

Any more suggestions/ideas?

I believe this issue is applicable to millions of us out there.

idanshmu
  • you have 30 days of free revisions in gdrive, and your gdrive folder itself won't get locked out, so you should be able to recover all your files for a month, and all your non-modified ones in perpetuity. – dandavis Oct 21 '17 at 08:11

1 Answer


Let's address the approaches you highlighted first.

Approach 1:

  • As you stated, this doesn't resolve the problem, nor does it prevent your files from being encrypted if you're hit. It only creates more problems: you end up with considerably fewer backups and are open to potentially more data loss.

Approach 2:

  • You do not need to use Backupify; they were just suggesting their product. You can very well create your own file server (hosted or on-premise) and set up the same level of retention they offer. Be mindful of security, however, as a dedicated provider will undoubtedly surpass a self-hosted setup in this area.

  • As you rightly noted, this also doesn't fully resolve the problem: you will need to restore a backup from before your files got encrypted, and picking that point is always guesswork (unless you can forensically or otherwise determine the date the attack started).

Now this is what it all boils down to: where services like Backupify offer value is in the length of retention. In your case, let's assume you have used 700 GB of your 1 TB drive (it can't all be used up, right?). Most ransomware variants wouldn't take months to encrypt that amount of data, but it could take up to 4 months for one of the slowest variants tested - see https://blog.barkly.com/how-fast-does-ransomware-encrypt-files.

Also bear in mind the hardware you are using: 16 GB of RAM and a quad-core processor coupled with an SSD could almost halve the time to total encryption.

What can we do then?

The answer is really simple: plan for the worst and hope for the best. The first thing you need to accept is that there will always be some data loss when ransomware hits.

From here, you will need to take steps to prevent an attack in the first place (antivirus software, etc.).

Next, you will need a remediation strategy, which is almost always going to be either restoring from backups or going Bitcoin shopping.

Here you'll need to determine the longest retention window you can afford. A smart approach is to look at tests like the one above showing time to full encryption and match the slowest variant - so in your case, retain backups for about 4 months.

But there is a quicker and cheaper way

This only really works with incremental and (arguably) differential backups.

What happens when you wake up, review your backup logs, and see that files you did not change have been backed up - loads of them? You immediately stop any further backups and roll back as quickly as you can!

Here you can afford a retention window as short as a month, provided you are diligent in monitoring your logs (and can remember what you've changed).

So that is my idea: monitor your backups daily, and you will be able to tell when you have been hit and minimise data loss.
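The daily check above can even be automated. The sketch below, in Python, takes a snapshot of file modification times and flags files that changed since the last run; the folder path, snapshot filename, and alert threshold are all assumptions for illustration, not part of any real backup tool.

```python
"""Hypothetical daily check: flag an unusual number of modified files.

Sketch only -- the snapshot filename, scanned folder, and threshold
are illustrative assumptions; tune them to your own setup.
"""
import json
import os

SNAPSHOT = "snapshot.json"   # where the previous scan is stored (assumed name)
THRESHOLD = 50               # "loads of them" -- tune to your normal daily churn

def scan(root):
    """Map each file path under `root` to its last-modified time."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.abspath(path) == os.path.abspath(SNAPSHOT):
                continue  # don't track our own snapshot file
            try:
                state[path] = os.path.getmtime(path)
            except OSError:
                pass  # file vanished mid-scan; skip it
    return state

def check(root):
    """Compare the current state to the last snapshot; return changed paths."""
    current = scan(root)
    changed = []
    if os.path.exists(SNAPSHOT):
        with open(SNAPSHOT) as f:
            previous = json.load(f)
        changed = [p for p, mtime in current.items()
                   if p in previous and mtime != previous[p]]
    with open(SNAPSHOT, "w") as f:
        json.dump(current, f)
    return changed

if __name__ == "__main__":
    # "~/Drive" is an assumed sync-folder path; substitute your own.
    changed = check(os.path.expanduser("~/Drive"))
    if len(changed) > THRESHOLD:
        print(f"ALERT: {len(changed)} files modified - stop syncing and investigate!")
```

Run it once a day (e.g. from a scheduled task) before your backup job; a sudden spike in modified files you don't recognise is your cue to cut off sync and roll back.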

(Now this will be easy for you to implement and carry out as a lone user, but very difficult to replicate in a work environment... for that, we are still stuck at the "educating-the-users" stage.)

tofwiz