Windows backup with support for incremental binary backup


Currently, I'm using the Windows 8.1 File History backup, but it has a major weakness: large files are copied in their entirety each time they change. For instance, I have some virtual machine files several gigabytes in size. Each time I run this backup, it saves a full new version of the file, although the file probably changed by just a few bytes. This is a HUGE waste.

I want some kind of binary/diff incremental backup function, so that each new backup saves only the changes within binary files and space requirements are minimized. How can I do this?

Sideshow Bob

Posted 2014-01-06T19:34:18.560

Reputation: 61

Just a word of caution: be careful with incremental backups, as many of them (if not all) store said increments in a proprietary format, meaning that you will not have access to your data if the backup program stops working / becomes unavailable / etc. Just make sure to pay attention to this angle. – Angstrom – 2014-01-08T07:14:23.033

Yes, I see this as a reason to go with one of the larger brands, at least? – Sideshow Bob – 2014-01-08T09:03:35.123

Answers


I found a solution: Areca Backup, free backup software with some really nice features, like delta copying of files. To get shadow copy support, you need a plugin, though. Duplicati also seems very powerful. Both are free, but I suggest a donation is justified if you find them useful.

Sideshow Bob

Posted 2014-01-06T19:34:18.560

Reputation: 61

Areca Backup has been unmaintained for a few years now – golimar – 2019-12-11T11:18:57.367


No real answer but some thoughts:

To do an in-place update of a remote copy of your file, you first have to find out which parts of the file have changed or been added since the last copy. One could maintain a list of checksums for fixed-size chunks of n megabytes. Without such a list, you have to read both the remote copy and the local copy to find the changes.

A drawback of an in-place update is the inherent risk of data corruption: if the update fails or is aborted, the remote copy is left in an inconsistent state.

The choice of a good replication strategy depends on several parameters:

  • Is it feasible to read the remote copy efficiently?
    How fast can this be done?
  • Are changes scattered across the whole file, or more or less localized?
  • What is the probability that a changed chunk goes unnoticed due to a clash in the checksum?
    The checksum algorithm has to be chosen in such a way that this is sufficiently unlikely.
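On that last point, a back-of-the-envelope estimate (assuming the checksum behaves like a random n-bit function, and using a hypothetical workload of 1 TB in 4 MiB chunks backed up daily for ten years) shows why the checksum width matters:

```python
def expected_missed_changes(comparisons, checksum_bits):
    # A changed chunk collides with its old checksum with probability
    # about 2**-n for a well-mixed n-bit checksum, so the expected
    # number of missed changes is roughly comparisons * 2**-n.
    return comparisons / 2.0 ** checksum_bits


# Hypothetical workload: 1 TB in 4 MiB chunks, backed up daily for 10 years.
comparisons = (10**12 // (4 * 2**20)) * 365 * 10

crc32_risk = expected_missed_changes(comparisons, 32)    # order of 0.1: too risky
sha256_risk = expected_missed_changes(comparisons, 256)  # negligibly small
```

Under these assumptions a 32-bit checksum would be expected to miss a change with non-negligible probability over the lifetime of the backup, while a 256-bit hash makes that essentially impossible.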

I think you have a good and valid point in your idea. It would be very practical, for example, to save big Outlook .pst files on a memory stick. However, I am not aware of any tool or method which overcomes all the obstacles mentioned above.

Axel Kemper

Posted 2014-01-06T19:34:18.560

Reputation: 2 892

Does software like Acronis True Image do incremental backups on a file level or on a "disk" level? Anybody? – Sideshow Bob – 2014-01-07T12:42:55.497

Acronis uses Changed Block Tracking (CBT) (http://www.v-front.de/2011/06/quick-primer-on-changed-block-tracking.html). VM disk blocks are marked as changed, so the backup procedure can copy only the changed blocks.

– Axel Kemper – 2014-01-07T13:34:35.650


If you don't need to keep an archive of past file versions, then Bvckup2 does exactly what you want.

Angstrom

Posted 2014-01-06T19:34:18.560

Reputation: 610

This is close, but not quite there. Definitely in the right direction – Sideshow Bob – 2014-01-08T09:04:19.783

What is it missing that you need? – Angstrom – 2014-01-08T09:13:03.110

...the ability to keep "versions" – Sideshow Bob – 2014-01-08T13:44:24.440