
In terms of memory usage and processor utilization, how does Windows Server 2019 ReFS deduplication compare to ZFS deduplication?

ylluminate
  • ZFS dedupe is a non-starter in many environments. What's your use case? – ewwhite Apr 24 '20 at 18:36
  • Mostly 40 years of a huge and messy family photo and video library, but also a lot of various (messy) backups. There's a good chance deduplication would be a big space saver. – ylluminate Apr 24 '20 at 18:48

1 Answer


See some test results here: https://trae.sk/view/26/ and https://trae.sk/view/33/

TL;DR: it's pretty good for data that is actually duplicated AND compressible. It's not exactly advertised, but anything deduplicated gets compressed as well; there is a blacklist of file extensions (with reasonable defaults) that are skipped for compression, though. Your images and videos may run through dedup (there may be duplicate files among them) but are likely excluded from compression. Your backups will likely be hugely reduced in size, as backups often contain repeated data.

There is a performance hit, but for regular end-user computing it's not noticeable - this is from experience; I have had hundreds of TiBs of deduplicated data (professionally) since Windows 2012.

I would not recommend ZFS deduplication in any measure except some very, very narrow edge cases - it's practically unusable, even with the recommended specs. I've tried it (and seen it tried) professionally, and it's been a disaster most of the time.

The dedup engine is practically unchanged between 2012 and 2019 from an efficiency perspective, so the results are the same on both NTFS and ReFS. Performance has gone up over the years, though.

Don Zoomik