
I have been reading a lot here about ransomware attacks and I am wondering whether my strategy for protecting myself is valid.

I have 10 GB of personal data and 90 GB of photos and videos. They live on my D:\ drive in two separate folders. The personal data folder is synced with Google Drive; the photos are synced with a similar tool (Hubic). This way every new photo I copy to the D:\ drive is soon sent to cloud storage. If my hard drive dies or is stolen I still have my online copy.

But in the case of a ransomware attack, I am thinking this might not be enough, since the encrypted or deleted files would likely be synced to Google Drive as well. So my question is:

  • Is my method of syncing my data to online storage services (Google Drive, Dropbox, etc.) a good way to protect myself against Ransomware?

  • Is there a better backup strategy for ensuring I can recover from Ransomware?

Note: There is a similar question here, but it focuses on whether the online storage vendor can be trusted. In my case I choose to trust them; the question is whether, after a successful ransomware attack, I would still have a backup and could ignore the ransom demand.

Oscar Foley
  • Just make a Git repository "MyComputer" and commit every time you go to sleep. – Tomáš Zato - Reinstate Monica Apr 18 '16 at 11:50
  • Of course I am doing my best to avoid this happening. This question is about a second line of defense... – Oscar Foley Apr 18 '16 at 12:28
  • Besides trying to educate them, my backup solution for the whole family is based on a daily rsync to a Linux server with btrfs, so even if they get infected, at most the work from the current day is lost. – PlasmaHH Apr 18 '16 at 12:52
  • @TomášZato Theoretically an awesome idea, but: you will lose permissions, it'll take up a lot of space, the commit time will be ages, and git is really bad with large binary files _especially_ when there are a lot of them. I know because I tried. – Sebb Apr 18 '16 at 13:31
  • @Sebb Frankly I'd add my movies folder to `.gitignore`, same with software - it can be infected anyway so why back it up. The only issue might be a database - that's a big file that regularly changes. But there are other solutions for backing up databases, so I would `.gitignore` it too. My proposal of course isn't ideal solution but would probably work on documents and photos quite well. How often do you edit your photos? – Tomáš Zato - Reinstate Monica Apr 18 '16 at 13:41
  • Wouldn't every answer be different for each online storage provider? Whether they're "safe" from ransomware and to what extent would depend on their backup policies surely? – Lilienthal Apr 18 '16 at 13:46
  • @TomášZato But if you hardly edit it, you may as well make two offline copies without any loss. And when you have to deploy several backup solutions anyway, you can simply use one intended for whole-PC backups instead of small text files. I personally have 95% of my code (which is basically what I do) over at my git server or GitHub, so I am _kinda_ using it, but it doesn't work as an all-around solution for backing up machines. I use [rsnapshot](http://rsnapshot.org/), in case you wonder and/or are searching for a similar solution without the downsides. – Sebb Apr 18 '16 at 14:03
  • If you use a git repository, you would still be vulnerable to a super-smart ransomware which encrypted your .git folder, and also did a destructive git push. Since cloud storage also often versions files it may not be any better, really. – Ben Apr 18 '16 at 18:06
  • @TomášZato The fact that you're arguing for excluding most of your important files from the git repository only reinforces the idea that git is not an ideal choice for backups. – user2752467 Apr 18 '16 at 18:25
  • @JustinLardinois Videos and software are not an important part of my files. Documents are. – Tomáš Zato - Reinstate Monica Apr 19 '16 at 07:33
  • @TomášZato Two issues with that. 1) When restoring a backup, I really want it to be as painless as possible, and not to be worrying about whether I missed any documents. 2) For plenty of people, videos and software *are* important. I'm not going to store all my family videos in git. And anyway, why would I do this when there are real backup solutions out there that don't have any of the downsides? – Chris Hayes Apr 19 '16 at 08:43
  • Guys, you're taking it too seriously. The first comment was not meant that seriously; if it were, I would have posted it as an answer. – Tomáš Zato - Reinstate Monica Apr 19 '16 at 12:07
  • I made this a few years ago: [Comparison of Cloud-Storage Backup Solutions](https://docs.google.com/spreadsheets/d/1i51uwM0J6A0ieDPd5xYzXvvVHmWLSENZovI_jHIbfIY/edit?usp=sharing). I'm sure some of it is out-of-date _(especially the prices)_ but it should still be useful. I ended up going with SpiderOak. – BlueRaja - Danny Pflughoeft Apr 19 '16 at 16:12
  • You can always use a "mist", a cloud that's in your house :) Get a local server and back up to it over the LAN. Fast and painless. Alternatively, write a backup script and use an external disk. – MatthewRock Apr 20 '16 at 10:00
  • No automatic backup is a good strategy: when ransomware encrypts your files, the backups will get overwritten by encrypted versions. – Agent_L Apr 20 '16 at 13:07
  • If you're looking for something like git but without disadvantages, there's [bup](https://github.com/bup/bup) – Farid Nouri Neshat Apr 21 '16 at 15:12

15 Answers


I'm not sure about Google Drive, but Dropbox provides a way to recover previous file versions, a feature that wouldn't be affected by ransomware, since it relies on file copies kept on the Dropbox servers. So it would certainly be a way of protecting your data.

However, recovering everything over your internet connection is a relatively slow process. Personally, I would use a NAS device, but wouldn't map it as a network drive (because mapped drives can, and will, be affected if ransomware is activated on your computer). I would access it via FTP/SFTP, probably with a script that syncs the files on a regular basis. This way you have the files locally, which makes restoring after an attack less of a problem. It is probably cheaper too.
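
A sync script along these lines can also keep superseded copies instead of overwriting them, so a run that happens after infection cannot destroy the last good backup. Here is a minimal, stdlib-only Python sketch of the idea (the function name, paths and version-naming scheme are my own illustrative choices, not any particular tool's behaviour):

```python
import filecmp
import os
import shutil
import time

def versioned_sync(src_root, dst_root):
    """Mirror src_root into dst_root, but never overwrite a changed file:
    the old copy is renamed with a timestamp first, so a file that was
    encrypted by ransomware cannot silently replace the last good backup.
    Files deleted from the source are deliberately left in the backup."""
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        dst_dir = os.path.join(dst_root, rel)
        os.makedirs(dst_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_dir, name)
            if os.path.exists(dst):
                if filecmp.cmp(src, dst, shallow=False):
                    continue  # content unchanged, nothing to do
                # keep the previous version under a timestamped name
                stamp = time.strftime("%Y%m%d-%H%M%S")
                shutil.move(dst, dst + ".v-" + stamp)
            shutil.copy2(src, dst)
```

Running something like this from a scheduled task against the NAS mount point (or through an SFTP wrapper) gives you a crude version history; you would still want to prune old versions before the disk fills up.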

Also, if you prefer a Dropbox-like experience, you might want to try ownCloud on your own device; it also keeps previous versions of files, allowing you to roll back in case of file damage or corruption. Keep in mind that storing multiple old versions of a file takes space on your NAS's disk(s).

Jakub
  • Wouldn't that regular sync via SFTP corrupt the backup if the original files are encrypted before the next sync? – Oscar Foley Apr 18 '16 at 09:53
  • Depends on the configuration - in most basic config (replacing old versions with new ones) - yes, files would get corrupted. You can configure the sync in other ways though, e.g. to ask you to confirm sync when many files are changed (ransomware attack usually changes thousands of files, which is quite noticeable), or to implement version rotation. – Jakub Apr 18 '16 at 09:58
  • To be fair I asked in general for Online Storage solutions and later used as example the ones I am using (Google Drive and Hubic)... – Oscar Foley Apr 18 '16 at 09:58
  • And that's why I would rather recommend running ownCloud on the OP's own hardware. That gives freedom in configuring it (e.g. making sure the versioning won't suddenly stop working at some point). NASes that support ownCloud (or proprietary solutions, like Synology's Cloud Station) aren't overly expensive, and pretty much all multi-disk devices support RAID. It also settles the trustworthiness of the service provider, since you become your own service provider. – Jakub Apr 18 '16 at 10:42
  • BTW I had to recover some encrypted files from Dropbox for a friend. So I can confirm that 1) ransomware *does* encrypt Dropbox folders, but 2) the Dropbox revision system allows you to easily revert all those changes. Note however that Dropbox isn't simply a shared filesystem. It provides quite extensive APIs, and a Dropbox-aware ransomware *may* be able to remove old revisions using the API. Currently it seems that ransomware isn't treating Dropbox folders in a special way, and it is trivial to revert the changes. – Bakuriu Apr 18 '16 at 11:22
  • Isn't there a limit to how much Dropbox will recover (for free)? – Lilienthal Apr 18 '16 at 13:35
  • Yes, Dropbox limits the time during which you can recover your files (30 days for the Free version, a year for Pro with "Extended Version History", and unlimited time for Business). As for the number of old versions they store, I can't seem to find information on that. The good thing is that previous versions do not count towards your storage quota. – Jakub Apr 18 '16 at 13:41
  • Ransomware WILL EVOLVE and introduce more attack vectors (as we witnessed in 2015-2016); you still have a single point of failure, a recipe for disaster. – Enis P. Aginić Apr 18 '16 at 14:48
  • FYI - Google Drive also keeps [previous revisions of files](https://support.google.com/drive/answer/2409045?hl=en) (including non-Google files). However, it is a little unclear as to how long those previous revisions are retained. [Another answer](https://support.google.com/docs/answer/190843?hl=en) claims `The revisions for your file may occasionally be merged to save storage space. This can happen due to the age of the file or the large size of certain revisions.` – FGreg Apr 18 '16 at 18:00
  • "Personally, I would use a NAS device, but wouldn't map it as a network drive (because those can - and will be influenced if a ransomware is activated on your computer)" Note that NAS devices are available that will automatically create periodic snapshots, so you can roll back to a previous one even if the contents of your network drives are wiped. – Jules Apr 18 '16 at 20:09
  • Isn't it true that with Google Drive you have to use the web interface to manually restore one file at a time from its previous version? Can be quite a hassle for people who have thousands of files. – Fiksdal Apr 19 '16 at 08:25
  • A "write only" NAS, that can only have versions of files deleted from it if a physical button is pressed would be one solution. – Ian Ringrose Apr 20 '16 at 13:28
  • While not impossible, use of the Dropbox API *does* require 1) that the application is registered as an app with Dropbox, and 2) that you authorize the app. So the ransomware would need to authenticate as you to Dropbox and manage the OAuth dance. Definitely possible, but it doesn't seem like very low-hanging fruit, as far as that goes. – Wayne Werner Apr 20 '16 at 21:02
  • The time taken to restore from a remote service is only relevant when you actually need to restore, and this should (hopefully) be infrequent enough to be tolerable. It also has the advantage that it protects you against other risks - such as the building that your computer and NAS are both in burning down. – James_pic Apr 21 '16 at 11:15

Simple, cheap and relatively scalable solution
(Although I'm aware it has nothing to do with online storage)

I have two USB drives that I rotate regularly (you can add a reminder to your calendar if you're afraid of forgetting). You can use one of the many synchronization tools to choose which folders should be copied; I use Allway Sync.

One of the drives is always offline. You could even move it to another location to make your data resilient to burglars visiting your home, or fire, or whatever.

You can encrypt the drive too if you don't want other people to tamper with your backup. I use VeraCrypt.

Notes

The more often you rotate your drives, the less data you will lose in case of a ransomware infection. Of course, that's the downside of this solution: you need to manually rotate your drives every now and then.
But it's a cheap, flexible and effective defence against many problems that could occur.

A simple solution

To me, it matters to keep the solution simple so that it cannot be misused. For example, the NAS solution will only work if no one ever mounts the drives. I can easily see how this could fail with inexperienced users who don't know exactly what they are doing.
Plan for the day when you sit down to troubleshoot some problem, mount the drives as a quick fix, and have totally forgotten about the backup scheme you set up several months ago.

Thibault D.
  • `One of the drives is always offline`. Then ransomware programmers would take another strategy and ask for the ransom after a few days (when both cold disks are infected). Then what? And what if the cold disk is damaged or lost? Online backup is a good solution, but only for those whose ratio of sensitive data to bandwidth is appropriate. – Xaqron Apr 18 '16 at 11:20
  • The ransomware that affected Mac OS X via Transmission did wait a few days before it started to encrypt. But encrypt and then wait a few days before asking for the ransom? Without you noticing that you cannot open your files anymore? I'm assuming it's hard to keep it a secret for many days that files were encrypted. Maybe I'm wrong. Whatever the case, this is a flexible solution, and no one forbids you to have more drives with different rotation delays. Like any solution this one is not 100% foolproof, but you can always make it more or less resilient. – Thibault D. Apr 18 '16 at 11:25
  • You should think ahead. If they adopt that strategy, they can simply keep showing you your files until they remove the key from your computer. Among the discussed solutions, online storage and versioning are good, but as you said each one has its own pros and cons. I don't trust any solution on an infected PC. So the first choice is to avoid infection. – Xaqron Apr 18 '16 at 11:31
  • @ThibaultD.: if the ransomware implemented the encryption using a filesystem-layer encryption like eCryptfs, then it may be possible for you to not notice that the file is encrypted for a long time, until the ransomware discards the encryption key or you restart your machine, as your filesystem still appears to function normally. – Lie Ryan Apr 18 '16 at 12:21
  • @LieRyan has any ransomware in the wild been shown to do that? The immediate problem is that while the ransomware is in stealth mode, the decryption key is local and can be grabbed by someone with basic protection in place, foiling the plan. Successful ransomware strategies to date all involve one-time one-way public key encryption so that the private key is never near the user, and can surely be ransomed. – Jeff Meden Apr 18 '16 at 15:34
  • If you rotate "hot-swappable" disks, you would likely have a third that stores for longer intervals and lives offsite. – Raystafarian Apr 18 '16 at 17:32
  • Plus you could do incremental/differential backups, so that you could still recover any files that got backed up while already encrypted, by going back a little further in history. – Ben Apr 18 '16 at 18:11
  • @JeffMeden yes, it has happened, although focused on servers. The ransomware provided a decrypted view of the filesystem / database for some time, communicating with a C&C that provided the key material for decryption. After some time elapsed, when backups were expected to have rotated, they disabled your view. When the victim goes to his backups, he discovers that the infection happened months ago. – Ángel Apr 18 '16 at 21:55
  • Encryption to protect against encryption. I like it. – WillS Apr 18 '16 at 23:29
  • @WillS I mostly wanted to suggest a simple solution, and encryption is not part of the scheme against ransomware here. The strength of this scheme is the offline copy, although it's been argued in the comments whether it's really safe. The encryption is optional and meant to deal with the fact that you'll have backups of your data on very insecure media (in a way that other people could easily access). – Thibault D. Apr 19 '16 at 06:10
  • @Ángel what was the ransomware name? And how do you not notice database performance tanking when all IO is going through CPU decryption? – Jeff Meden Apr 19 '16 at 12:28
  • @Jeff Meden, Ángel is likely talking about this incident: (http://www.theregister.co.uk/2015/02/03/web_ransomware_scum_now_lay_waste_to_your_backups/). This, however, was not a "regular" ransomware, but a very targeted attack relying on a stolen FTP password and a specific setup, not just somebody opening an e-mail attachment – Maxim Apr 19 '16 at 21:06
  • Thanks for finding it, @Maxim; I was unable to find a news piece for that when commenting. It is not the _usual_ ransomware, as it was targeting webservers. I should note, however, that it could have spread as well through a CMS vulnerability (recently there were some "light" [website ransomware](https://blog.sucuri.net/2016/01/ransomware-strikes-websites.html) infections using remote-execution CVEs). – Ángel Apr 19 '16 at 23:33
  • @Xaqron Don't most backup applications have a means of verifying backup integrity? You could just use a separate machine (perhaps always disconnected from a network) to verify the integrity of the hot drive just before rotating it out. – alexw May 22 '17 at 16:18

At the time of writing, Dropbox would be a good way to mitigate ransomware attacks because a 30 day version history of file changes is kept on their servers (even on the free tier).

Depending on the volume of data, this requires a fast internet connection, for both upload and download, to be effective.

However (big caveat), it wouldn't take much for new ransomware to be engineered that grabs your session token from Dropbox.com, or installs a keylogger to capture your cloud provider password, and then selects the "Permanently Delete" option, rendering the files irrecoverable.

The same goes for any online storage option, whether mapped as a drive or not, as ransomware could easily be engineered to seek out SMB shares on the local network and encrypt files there too. The only real option for online backup would be a write-only target where the network protocol allows new files, changes and deletions only under full version control, with no possible way to disable the version control or permanently delete past copies.

This then leaves offline backups as the final option. These would have to be manually initiated to removable media, which would be best stored encrypted off-site for protection against non-malware threats (e.g. fire or theft).

SilverlightFox
  • "The same goes for any online storage options, whether mapped as a drive or not, as ransomware could easily be engineered to search out SMB shares on the local network and encrypt files there too." My local network fileserver uses a ZFS filesystem which takes daily snapshots and deletes them after a week, so unless I don't notice the ransomware for a whole week, I can just roll it back. I don't know whether there are commercial off-the-shelf storage systems that do this, but I know some are based on ZFS and this is a pretty simple addition, so I'd be surprised if at least one couldn't do it. – Jules Apr 18 '16 at 19:58
  • A quick search suggests that, yes, this feature is readily available on low-cost commercial NAS servers (e.g. [the one described in this knowledge base article](http://kb.netgear.com/app/answers/detail/a_id/23353/~/what-are-basic-snapshot-concepts-i-need-to-understand-before-operating-my)). – Jules Apr 18 '16 at 20:05
  • My next sentence covered that scenario - as long as the network protocol doesn't allow version deletions then you're fine. – SilverlightFox Apr 18 '16 at 20:08
  • Ah, yeah, I thought you were talking about setups where once a file is written it can't be modified at all, rather than a behind-the-scenes snapshot being taken. – Jules Apr 18 '16 at 21:11

What would you recommend as backup strategy to avoid Ransomware?

Read Only Storage

The simplest solution covers 90% of the average person's data preservation needs: store your old data in a read-only format. How much of your data is old tax information, resources from past schools/jobs, photos from vacations, or any other type of information that isn't going to change?

A common DVD-R stores nearly 5 GB for a few bucks. In addition to storing your information on Dropbox or an external USB drive, just throw last year's stuff on a disc on January 1 and Sharpie the year on the top. Continue to back up drives regularly in whatever way is convenient, but a physical "checkpoint" in a filing cabinet is never a bad thing.

For professionals in charge of large amounts of business data, 1 TB optical storage is apparently on the way, although frequent network backups are still necessary when even a few days' worth of data (code development, business contracts, professional photography shoots) could be worth a lot of money.

user1717828
  • Note that you should periodically check optical backups to be sure they're still working. DVD-R has a theoretical lifespan somewhere in the high tens of years, but unless kept in optimal conditions it might not make it anywhere near that long. – Jules Apr 18 '16 at 20:03
  • The price for a single 4.7(?) GB DVD+-R disc should be closer to 25 cents US. A BluRay -R disc (25GB or 50GB DL?) should be a few bucks. – Xen2050 Apr 19 '16 at 23:20
  • @Xen2050, I'm pretty sure it costs more than— [OHH SHHH-NEVERMIND](http://www.amazon.com/dp/B00081A2KY). – user1717828 Apr 20 '16 at 11:56
  • you can get "M-Disc" dvds that are purported to last much, much longer than standard ones, which as @Jules says are not that reliable. http://www.extremetech.com/computing/92286-m-disc-is-a-dvd-made-out-of-stone-that-lasts-1000-years They are writeable in at least some standard burners. – Dan Pritts Apr 20 '16 at 17:30
  • @Jules, I guess you should periodically re-burn the disks? Checking to see if you lost data means it's already too late, right? – JPhi1618 Apr 21 '16 at 14:25
  • @JPhi1618, Meh, this is a consumer-grade solution to a consumer-grade question, as specified in the first sentence (professional usage addressed at the end). *Periodically check backups*? In reality, consumers ain't got time for that. If it's that important where you want it to last 100 years without being used, pay to [put it on cassette](http://www.economist.com/blogs/babbage/2013/09/information-storage), otherwise just deal with an affordable, suboptimal compromise. – user1717828 Apr 21 '16 at 14:43

It's kind of scary that only one answer here mentions verification of backups, so I felt the urge to add this answer:

Whatever you chose as a backup strategy: Your backups are worth absolutely nothing if you don't have a working and well tested verification mechanism to check the integrity of the files.

It's just a matter of time until ransomware will decrypt files on the fly while you access them. After a certain amount of time the malware then deletes the local copy of the key(s), rendering many or most of your backups worthless. Attacks like this have already been reported against databases on webservers.
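
A working verification mechanism can be as simple as a checksum manifest built at backup time and kept offline: if files are later rewritten, or decrypted views are being served while the data on disk is encrypted, re-hashing exposes the mismatch. A rough stdlib-only Python sketch, with function names of my own invention:

```python
import hashlib
import os

def build_manifest(root):
    """Map each file under root (by relative path) to its SHA-256 digest.
    Store the result offline: ransomware that rewrites files cannot also
    fix checksums it never sees."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            manifest[os.path.relpath(path, root)] = h.hexdigest()
    return manifest

def verify(root, manifest):
    """Return the relative paths that are missing or whose content changed."""
    current = build_manifest(root)
    return sorted(p for p, digest in manifest.items()
                  if current.get(p) != digest)
```

Ideally the verification runs from a separate, known-clean machine or boot medium, so the malware cannot lie to the checker.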

Despite that: personally, I would never upload any sensitive data to any cloud service without encrypting it beforehand. (Yup, that's kind of ironic.) Remember: "cloud" is just a synonym for "another guy's server".

Noir
  • Scary article! I'd hate for backups to be encrypted transparently. So for home use, I'd guess it would be best to run the backup software off separate bootable media, and sanity-check the files included in any incremental backup if possible, as a mitigation. Although I guess if the ransomware only encrypts when you write anyway you'd still likely miss it... – Ben Apr 19 '16 at 11:00
  • There is nothing wrong with encryption, as long as **you** have the decryption key. In fact, I would argue the same as you: encryption before uploading to the cloud should be the default, to be disabled only in specific cases where local encryption simply isn't feasible for the use case in question. – user Apr 19 '16 at 20:14
  • @MichaelKjörling like Mozy does (and with your own key) - but I'd say you want not just encryption but delta-fication and historical versioning too, the least amount of data sent to the cloud is a good thing for all (and easily defeats MitM attacks once the initial data is uploaded, if all they get is a few binary diffs) – gbjbaanb Apr 20 '16 at 15:34

No. Consumer-grade cloud backup is not an effective solution on its own. In fact no single solution will protect your data; you must mix it up a bit.

To give you a good answer I would have to know about your habits, usage patterns, and a lot of other details, but here is my best guess based on an average home/small business owner I'm usually working with.

So, on to backup, or to be exact, archiving.

It's a very complex question and you should decide how much you can afford to lose. Providing 99.9% data security is a VERY expensive affair (think redundant, geographically scattered storage with no single point of failure). Data can get lost in many more ways than you think, not just ransomware. For example, a DVD or BD-R will only last a few years, a flash drive will be dead in around 7 years, a typical hard drive is not usable after 5 years, a format may get deprecated, an interface may get deprecated, hard drives can make uncorrectable errors (and in fact they do), your backup may be killed by a lightning strike, fire, or flood, it may get stolen, you could lose your password if you encrypt (and you should)... Just imagine a nightmare scenario where you have an NTBackup archive on a failing IDE hard drive - fun.

So a few solutions:

First of all, monitor your filesystem. A ransomware attack will create huge filesystem changes, and you will know there is a problem right away.
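
Monitoring does not need anything fancy; comparing periodic snapshots of file sizes and modification times is enough to catch a mass rewrite, since ransomware touches a large fraction of files in one sweep. A hypothetical Python sketch (the 50% threshold is an arbitrary assumption you would tune for your own data):

```python
import os

def snapshot(root):
    """Record (size, mtime) for every file so two scans can be compared."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            state[path] = (st.st_size, st.st_mtime)
    return state

def mass_change(before, after, threshold=0.5):
    """True when more than `threshold` of the previously seen files were
    modified or deleted between two snapshots: the signature of a
    ransomware run, which rewrites thousands of files at once."""
    if not before:
        return False
    touched = sum(1 for path, sig in before.items() if after.get(path) != sig)
    return touched / len(before) > threshold
```

Run snapshot() on a schedule and raise an alarm (and pause any automatic backups) when mass_change() trips.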

OPTION 1 - go with M-Disc. 100 GB of data is not that much, so you can make two copies of it on 100 GB M-Disc BDXL. Put one in a drawer at home, put one in a bank safety deposit box, and you are good. For a millennium, they say. Bear in mind you can still lose your data. It's a read-only medium, so using it on an infected computer is not a problem. Between archiving runs, use a full-size SD card (say 128 GB); flip its switch to read-only for everyday use and to read-write when you back up. Or use a DVD until you have enough for another M-Disc archive (pay attention to DVD longevity). I'm not affiliated with M-Disc in any way, but I do have pretty good experience using it.

They also have Dropbox + M-Disc solution on their website, so you can use Dropbox for convenience, and get your archive shipped in.

OPTION 1.1 - Same as above, but using regular Blu-ray discs. It's cheaper but much riskier. Make sure you re-burn your archive once a year.

OPTION 2 - set up a small (Linux) file server and mount its storage for convenience, but make sure it versions its backups to storage not accessible from your client computers (NAS or cloud or whatever). So if something goes wrong, the mounted storage will get encrypted, but you can always go back, as the server itself is not infected. Firewall it to disallow remote access, as future, more advanced ransomware may be able to exploit it by stealing credentials from an infected client. Make sure you always have more than one copy of your data, consider the longevity of the media used, and replace hard disks at the first sign of trouble.

OPTION 3 - get a credible IT guy to set up a solution tailored to your needs so you get instant access to your data and an (almost) bulletproof archive. I know people come here for DIY solutions, but data protection is a science, not something you can sum up in a single page, and I'm 100% sure you can't see all the caveats to your solution.

Whatever you choose, there is no "set it up and forget about it" solution, and whoever claims there is, is most likely incompetent.

  • Thank you! Thus far, the only really good answer. Cheers! and +1 – Citizen Apr 19 '16 at 06:36
  • `Between archiving use a full size SD Card (say 128GB), flip it's switch to read only for everyday use and to read-write when you backup.` This switch is not real write protection. It's just a flag telling the host system that it _should_ not permit any write access. Therefore I would not rely on it in extreme cases like securing my sensitive data against malware. – Noir Apr 19 '16 at 22:30
  • @Noir You are right. Future-proofing is a very important part of any backup strategy, and I've suggested another solution instead of the potentially unsafe SD card. – Enis P. Aginić Apr 20 '16 at 06:43

Keep it simple.

While cloud-sync solutions may provide protection against ransomware through file versioning, choosing an individual solution requires research (1)(2), and I think it's a task not worth the hassle. Client functionality differs from one cloud service to another, and these companies create and support their solutions mainly as synchronisation tools rather than for backup and versioning.

  (1) Google Drive offers file versioning (30 days), but old versions count towards the space limit. Google does not seem to publish what happens if you had a 100 GB plan and ~100 GB of data that changed all at once: it could either stop syncing or sacrifice the old versions.

  (2) Dropbox offers unlimited versioning kept for 30 days in their paid-plans.

I would suggest going with a full-fledged versioning backup solution which backs up to cloud (as well as your local network destination).

I use Arq, which de-dupes files in a git-like fashion (3) and AES-encrypts them before they leave the machine. Files stored in the cloud or on the network do not morph after being backed up, thus no ransomware can change their content (unless it replaced the backup executable itself, but that would be a very targeted attack).

  (3) This means that backed-up files are split and treated as immutable chunks of data. When source files change, new data is written, with the old remaining until a garbage collection process removes it.
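
The scheme in (3) is easy to illustrate. This is a toy Python sketch of content-addressed, write-once chunk storage, not Arq's actual on-disk format:

```python
import hashlib
import os

CHUNK = 1 << 20  # split files into 1 MiB pieces

def store_file(path, store_dir):
    """Split a file into chunks addressed by their SHA-256.  Chunks are
    write-once: a changed source file adds new chunks while the old ones
    stay untouched until garbage collection, so already-backed-up history
    cannot be morphed by later changes to the source."""
    os.makedirs(store_dir, exist_ok=True)
    recipe = []  # ordered chunk ids needed to rebuild this file
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(CHUNK), b""):
            digest = hashlib.sha256(chunk).hexdigest()
            blob = os.path.join(store_dir, digest)
            if not os.path.exists(blob):  # dedupe: identical data stored once
                with open(blob, "wb") as out:
                    out.write(chunk)
            recipe.append(digest)
    return recipe

def restore_file(recipe, store_dir, out_path):
    """Rebuild a file by concatenating its chunks in order."""
    with open(out_path, "wb") as out:
        for digest in recipe:
            with open(os.path.join(store_dir, digest), "rb") as blob:
                out.write(blob.read())
```

(A real tool would also encrypt each chunk before writing it, as described above.)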

Most important: it is a solution for data protection that can be tested (restore to the same machine, to another one) ahead of any disaster.

Such an approach also treats any cloud service as mere storage space, thus freeing the user from having to consider subtle differences between services.

The only thing it does not offer is web access to files (because of the encryption), so one has to perform a restore (and a software installation in case all computers were lost), but you need to decide whether you want backup-and-protection or synchronisation-and-sharing.

techraf
  • Well, the Google Drive and Dropbox PC clients give you visual feedback when they're uploading things... The only way I wouldn't notice lots of gigabytes being uploaded would be if I left my PC turned on at the wrong time... – Freedo Apr 18 '16 at 17:13
  • Yes. That is a possible protection policy: "*For security of my data I rely on my ability to spot unusual network traffic*". – techraf Apr 18 '16 at 22:54

I use a stack of external USB 3 hard drives, "A", "B", "C", etc., that I rotate in sequence, and run an automatic nightly backup. (My computer runs 24/7, so at night it runs tasks like full backups, deep malware scans, and occasional defrags.) In other words, the drive that gets written to tonight is the oldest one in the sequence. I keep 3 of them offsite in a bank safe-deposit box, which I refresh roughly once a week. Since I regularly go to the bank, or the strip mall the bank is in, for other business, this does not add much of a burden. (Offsite storage protects against fire, theft and similar occurrences.)

The only other work I have to do is, when I sit down at my computer to start my day, I have to remember to swap out the USB connector of last night's backup with the next one in the sequence, which is all habit now.

My next to-do task for this problem is to add some automatic verification that the files are readable and not encrypted. Right now I do that manually on a spot-check basis but that takes time and attention so I'd like to automate it.
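
One cheap heuristic for "readable and not encrypted" is a byte-entropy spot check: ciphertext is statistically indistinguishable from random data, so a document or spreadsheet that suddenly measures near the 8 bits-per-byte maximum deserves attention. A Python sketch of the idea (the 7.5 threshold is my own guess, and already-compressed formats such as JPEG or ZIP will legitimately score high, so apply it to file types you expect to be low-entropy):

```python
import math
import os
from collections import Counter

def entropy(data):
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(path, threshold=7.5, sample=65536):
    """Heuristic check on the first `sample` bytes of a file."""
    with open(path, "rb") as f:
        return entropy(f.read(sample)) > threshold
```

Sampling a random subset of files on each backup drive after every rotation would automate the spot check without reading whole disks.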

Peter N
  • 61
  • 1
  • 4
    So, a standard backup rotation schedule, then. It's refreshing to see people do this very simple process. – schroeder Apr 18 '16 at 16:42
  • 3
    Instead of rotating in sequence I used to use the Tower of Hanoi Scheme. Not much gained when used only with 3 disks but take it to 4 or 5 and you gain a longer backup period. *[A set of n tapes (or other media) will allow backups for 2n-1 days before the last set is recycled. So, three tapes will give four days' worth of backups and on the fifth day Set C will be overwritten; four tapes will give eight days, and Set D is overwritten on the ninth day; five tapes will give 16 days, etc.](https://en.wikipedia.org/wiki/Backup_rotation_scheme)* – Lieven Keersmaekers Apr 19 '16 at 05:06
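For the curious, the Hanoi rotation described in the comment above is easy to automate: the tape to overwrite on a given day is the number of trailing zero bits in the day number, capped at the last tape. A sketch (the function name is mine; tape 0 is the most frequently overwritten set):

```python
def hanoi_tape(day: int, tapes: int) -> int:
    """Tape to overwrite on a given day (day >= 1) under a Tower of Hanoi
    rotation: tape 0 every 2nd day, tape 1 every 4th, tape 2 every 8th,
    with the final tape absorbing the remaining slots."""
    trailing_zeros = (day & -day).bit_length() - 1
    return min(trailing_zeros, tapes - 1)
```

With three tapes this yields the classic A B A C A B A C pattern.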
4

The NAS setup is my favorite choice. For a second backup you could use an offline hard disk and make a backup once a month. The NAS is nice because you can take the disks offline whenever you want, or you can do what I try to do: switch the disks around once a year and put clean disks in the NAS (also good for speed etc.). Keep the old hard disks in a safe place (and label them).

If they ever get through your first layer of security and infect your NAS, you can just take out the disks.

Also I use default recovery systems that help me keep backups for when such things happen.

Dakpan
  • 141
  • 3
  • 3
    If by "default recovery systems" you mean Window's Shadow Copy, you might be in for a nasty surprise at some point - many ransomware programs are able to turn it off and delete the older versions of files. – Jakub Apr 18 '16 at 13:02
4

Crashplan, a paid cloud storage provider, has a dedicated article on how its online cloud storage solution can help you recover from some ransomware attacks.
Their services could be a suitable alternative for your use case, where you need to back up a large amount of data for long-term storage (thanks @GuntramBlohm).

Excerpt:

CryptoLocker and CryptoWall are a form of malware that encrypts files on your computer and demands that you pay a ransom to decrypt these files. Instead of paying the criminals behind this attack, you can use CrashPlan to restore your files from a date and time prior to the infection. This article describes how to use CrashPlan to recover your files from a CryptoLocker or CryptoWall attack.


EDIT:
As noted in comments by @Ajedi32, clever ransomware could permanently delete files from your history, making your original files unrecoverable.
Many cloud storage providers don't delete your files immediately, but rather move them to a (time-limited) trash directory. That alone is not enough, as the trash directory can typically be emptied at any time.

Clever ransomware targeting ...

mucaho
  • 378
  • 1
  • 7
  • 1
    While CryptoLocker and CryptoWall may not do this, couldn't ransomware theoretically delete your CrashPlan backups? I'm pretty sure CrashPlan does let you delete stuff from their desktop UI without requiring you to reauthenticate with your username/password. – Ajedi32 Apr 18 '16 at 19:22
  • 1
    This sounds more like an advertisement than an answer. At the very least, it should disclose if there's any affiliation between the poster and the company, and there should be an explanation on how this differs from the OP's strategy. – Guntram Blohm Apr 18 '16 at 19:37
  • 1
    @GuntramBlohm I'm in no way affiliated with that company, I'm merely highlighting one alternative cloud storage solution, that specializes in large-scale backups. I'm disappointed that you would think otherwise. – mucaho Apr 18 '16 at 20:02
  • 1
    @Ajedi32 **That can be achieved.** With the default backup settings, deleted files are kept forever. Now, if ransomware would be smart enough to change the backup settings, they could change them so that newly deleted files do not stay in the archive. Additionally, files that were deleted before the settings change, stay in archive until a maintenance is done, which can be _unfortunately_ [forced by the user](https://support.code42.com/CrashPlan/4/Configuring/Deleting_Files_From_Your_Backup_Archive#Removing_Files_From_Your_Archive_%28Optional%29). – mucaho Apr 18 '16 at 20:32
  • @GuntramBlohm **Instead of paying the criminals**, pay us. We are not crooks, at least. – Mindwin Apr 19 '16 at 12:24
4

No solution involving "backup to cloud" code that runs SOLELY* on your "work" PC is safe.

*updated thanks to comments

Sooner or later the ransomware authors will start hijacking cloud-storage logins.

My solution is to share the user's folders so that a second, highly secure Linux box somewhere (local or cloud) can read the user's files and back them up to whatever the appropriate destinations are: local read-write media, local read-only media, or cloud. Assuming the Linux box stays secure, the malware cannot attack the backups directly.
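A rough sketch of that pull-style snapshot as it might run on the trusted Linux box. All paths are hypothetical, and in practice `rsync -a --delete --link-dest` does this job; the Python below only illustrates the hard-link rotation such a pull performs:

```python
import os
import shutil
from pathlib import Path

def pull_snapshot(src: Path, dest: Path, stamp: str) -> Path:
    """Create dest/<stamp> as a snapshot of src, hard-linking any file
    unchanged since the previous snapshot (what rsync --link-dest does)."""
    latest = dest / "latest"
    prev = dest / os.readlink(latest) if latest.is_symlink() else None
    snap = dest / stamp
    snap.mkdir(parents=True, exist_ok=True)
    for path in src.rglob("*"):
        rel = path.relative_to(src)
        target = snap / rel
        if path.is_dir():
            target.mkdir(parents=True, exist_ok=True)
            continue
        target.parent.mkdir(parents=True, exist_ok=True)
        old = prev / rel if prev else None
        if (old and old.is_file()
                and old.stat().st_mtime_ns == path.stat().st_mtime_ns
                and old.stat().st_size == path.stat().st_size):
            os.link(old, target)          # unchanged: share the inode
        else:
            shutil.copy2(path, target)    # new or modified: fresh copy
    if latest.is_symlink():
        latest.unlink()
    latest.symlink_to(stamp)
    return snap
```

Because the malware never sees the destination, each dated snapshot survives even if the source share is fully encrypted afterwards.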

You WILL need to keep enough historical full backups to cover the maximum time between when ransomware STARTS encrypting your files and when you NOTICE it happening. This might be considerably longer than a few days.

Ransomware authors face a trade-off between acting slowly, to affect as much of the backups as possible, and acting quickly, to avoid detection and being stopped from encrypting further files.

Photo archives are a juicy target here, as they tend to be stored for a long time without being looked at and are often of high sentimental value. Slowly encrypting just someone's photo collection (original and backup) MIGHT yield a higher ransom payout than quickly attacking an entire PC.

John McNamara
  • 696
  • 5
  • 7
  • Cool idea! Ransomware crawls available network drives, so you're not making the computer push the backup, you're making the backup pull the computer. I wonder if the several network-controlled backup solutions can work this way. – Ben Apr 19 '16 at 10:50
  • Backup to cloud storage can be safe against ransomware if it uses a log file storage, i.e. all file modifications are translated to appending a versioning log, and that the authorization token used in the machine is not permitted to alter historical logs. Most consumer grade cloud storages do support versioning, but I don't know if any can be restricted to append-only permission. – Lie Ryan Apr 19 '16 at 12:24
  • @LieRyan That's a clever and effectively a similar solution, 1 untrusted PC to provide source data and 1 trusted PC to act as backup process authority. It would be great if all the cloud vendors adopted a standard around this idea. I'm not aware of any that offer it, maybe some of Amazon's more complex solutions? – John McNamara Apr 19 '16 at 13:25
3

Let's summarize how ransomware works:

Ransomware will encrypt everything it finds. This includes:

  • All local drives
  • External media connected to your computer at the time of attack
  • Mounted network shares with write access

This provides you with the following possible precautions

  • Cloud storage without a locally installed client (not feasible for large quantities of files)
  • Cloud storage with local client, but you're always logged out unless a backup is due (easy to forget)
  • Offline drives, which you connect only when a backup is due
  • Read-only media such as DVD and Blu-ray (good for archiving)
  • A network path to which your user account has no access; the backup task then runs from a different account that does have access. The backup method must store multiple versions, however; otherwise the intact backups might simply be overwritten with the encrypted/corrupted versions if the user does not notice the encryption early enough.

Personally I chose the last option because it still gives me fully automated backups. If my computer were compromised by ransomware, it would not be able to encrypt the network share due to the lack of permissions. Pair that with occasional offline backups and you should be fine.
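The versioned, permission-restricted store behind that last option can be sketched like this. It is a simplified illustration meant to run under the dedicated backup account; the paths and permission bits are examples, not a hardening recommendation:

```python
import os
import shutil
from pathlib import Path

def versioned_backup(src: Path, dest: Path, stamp: str) -> Path:
    """Run under the backup account against a share the everyday user
    cannot write to. Every run lands in its own directory, and the
    finished snapshot is made read-only as an extra speed bump."""
    snap = dest / stamp
    shutil.copytree(src, snap)
    for root, _dirs, files in os.walk(snap, topdown=False):
        for name in files:
            os.chmod(os.path.join(root, name), 0o440)   # r--r-----
        os.chmod(root, 0o550)                           # r-xr-x---
    return snap
```

Even if a later run faithfully copies already-encrypted files, the earlier dated snapshots remain untouched and recoverable.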

Unfortunately, if we take all possible exploits into account, the only fully safe option is read-only media. A root exploit, keylogger or similar renders most other methods useless, though because of the extra effort these would probably only be used in attacks targeted at individuals (a local office, restaurant, etc.).

Potaito
  • 268
  • 1
  • 8
  • 1
    Last option needs: "...and hope the ransomware doesn't also include an exploit to get permissions to the network path." Maybe not a valid assumption in general, although probably safe for now, at least for common attacks. – Ben Apr 18 '16 at 18:20
  • @Ben You are right, I added the offline backups to my solution. – Potaito Apr 19 '16 at 06:27
2

The other answers address your first concern very well. As for an alternative that keeps files safe from ransomware while not depending on third-party solutions (i.e. cloud hosting):

Get another PC (one with a big storage unit) running a safer OS on your local network, and install a version control server (e.g. Subversion).

Commit your files into a working copy, and keep it synced through the version control client.

Subversion would be fine for that, since there won't be many conflicts from concurrent commits. You can script the sync commands to run on a schedule.
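Such a scheduled sync script might look like the sketch below. It assumes the `svn` client is on PATH and the working copy is already checked out; the `run` hook and the example path are mine, added only to keep the sketch testable:

```python
import shlex
import subprocess

SVN = "svn"  # assumed to be on PATH; the working-copy path is an example

def sync_working_copy(path: str, run=subprocess.run) -> list:
    """One scheduled pass: register any new files, then commit everything.
    The `run` hook defaults to subprocess.run and exists for testability."""
    cmds = [
        f"{SVN} add --force --quiet {path}",
        f"{SVN} commit --message 'scheduled backup' {path}",
    ]
    for cmd in cmds:
        run(shlex.split(cmd), check=True)   # raises if svn reports an error
    return cmds

# e.g. scheduled from cron:  */30 * * * *  python3 /opt/backup/svn_sync.py
```

`svn add --force` on an already-versioned path simply picks up unversioned files, so the same two commands work on every run.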

Mindwin
  • 1,118
  • 1
  • 8
  • 15
  • This option seems pretty interesting. Maybe using git offered by bitbucket... – Oscar Foley Apr 18 '16 at 13:38
  • @OscarFoley yes, any version control will work. Using a secondary machine has the advantage of LAN bandwidth and physical access. – Mindwin Apr 18 '16 at 14:30
  • 1
    In this case SVN or Hg would actually be better, due to git's destructive push capability. You want to be able to create historic versions that can't be removed. Really though, this is not a very realistic use of version control, which isn't really designed as a backup solution. – Ben Apr 19 '16 at 10:53
1

Any cloud storage service with versioning enabled will protect you from ransomware. The versioning option is key here, as you may need to recover a previous version if the malware changed your files in the cloud service.

AWS S3 offers versioning as an option, but it is not enabled by default. Dropbox has versioning enabled by default. Google Drive requires you to manually enable the "Keep forever" option for non-Google files (i.e. anything other than Google Docs and the like).

There is however a relatively simple non-cloud mitigation for this threat.

  1. Set up a NAS or Linux file server for storing data, then add an external backup drive(s) to it.
  2. Let your computers use storage directly from the NAS/server, or backup local storage to it.
  3. Have a backup utility run directly on the NAS/server and set it to back up to the external drive.
  4. Ensure the external drive is not accessible from anywhere besides the NAS/server itself - via file permissions on Linux, or via a configuration option in the NAS GUI.

This secondary backup will be inaccessible to ransomware on your computer. If you were to be infected, you would clean the malware (duh) and then restore that secondary backup to your NAS/server.

Most NAS units have a built-in backup feature for external drives. If you're running a Linux server, rsnapshot will do the job. You can set it up in cron to run as often as you like, guaranteeing that you never lose more than the data produced in that interval.
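For illustration, a minimal rsnapshot setup along those lines might look like this (all intervals and paths are placeholders, not a recommendation):

```
# /etc/rsnapshot.conf (excerpt) -- fields must be TAB-separated
snapshot_root	/mnt/external/snapshots/
retain	hourly	6
retain	daily	7
retain	weekly	4
backup	/srv/nas-data/	localhost/

# /etc/cron.d/rsnapshot -- system crontab, with a user field
0 */4 * * *	root	/usr/bin/rsnapshot hourly
30 3 * * *	root	/usr/bin/rsnapshot daily
0 4 * * 1	root	/usr/bin/rsnapshot weekly
```

The hard-linked snapshots cost little space when data rarely changes, which is exactly the photo-archive case from the question.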

CrashPlan could also be used for local backup if you need dedupe/compression. (Though that will only run once a day unless you subscribe)

Though this takes some setup work, it can run mostly maintenance free until you fill your storage.

  • Reminds me of [BackupPC (OSS utility)](http://backuppc.sourceforge.net/info.html) that rsyncs your data to a remote location, but uses symlinks to keep historical data and reduce storage requirements. Best thing - it pulls data from the clients, so clients will not be able to see or write to the backup storage too. – gbjbaanb Apr 20 '16 at 15:43
1

I made my backup drives bootable with a small Linux system that runs an automatic rsnapshot and shuts down once it completes successfully. Since my data doesn't change much, I can keep a large number of snapshots.

Oh, and if you're really paranoid: measure the time rsnapshot normally takes, and if it suddenly takes a lot longer, that's a good indication that something is wrong on your system...
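That timing check can be automated in a few lines. A sketch, where the 2x-median threshold and the 30-run history are arbitrary choices of mine:

```python
def check_duration(seconds, history, factor=2.0):
    """Return (suspicious, updated_history). A run taking more than
    `factor` times the median of past runs means far more data changed
    than usual -- which is what a background encryption sweep looks like
    to an incremental backup."""
    suspicious = False
    if history:
        ordered = sorted(history)
        median = ordered[len(ordered) // 2]
        suspicious = seconds > factor * median
    return suspicious, (history + [seconds])[-30:]   # remember last 30 runs

# Typical use: time the rsnapshot call, persist `history` to disk (e.g. as
# JSON) and send yourself a mail whenever `suspicious` comes back True.
```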

Thomas
  • 498
  • 2
  • 6