
I have read the documentation for DDPEval.exe and various blogs and so on and cannot find any information on how to interpret the results.

I ran the tool on a drive on one of our servers and found the results confusing, particularly with regard to "Optimized files" and "compression". To get a better understanding, I ran the tool again on a specific folder only, where there is no NTFS compression in place (the contents are SQL Server backups). For brevity I will post only the results from this second test:

Evaluated folder size: 69.07 GB
Files in evaluated folder: 6

Processed files: 6
Processed files size: 69.07 GB
Optimized files size: 15.61 GB
Space savings: 53.45 GB
Space savings percent: 77

Optimized files size (no compression): 69.04 GB
Space savings (no compression): 26.93 MB
Space savings percent (no compression): 0

What do these numbers mean, and is the tool telling me I can save 53.45 GB or 26.93 MB?

Stephen Kennedy

2 Answers


You can save 53.45 GB (reducing space usage by 77%) if you use both deduplication and compression.

You can save 26.93 MB if only deduplication is used.
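To see where those two figures come from, the savings lines in DDPEval's output are simply "before" minus "after". Here is a small Python sketch using the numbers from the question; the tiny differences from the report are just rounding in the displayed GB values:

```python
# Figures reported by DDPEval for the SQL backup folder (from the question)
processed_gb = 69.07      # "Processed files size"
optimized_gb = 15.61      # "Optimized files size" (after dedup + compression)
optimized_nc_gb = 69.04   # "Optimized files size (no compression)"

# Savings are simply "before" minus "after"
savings_gb = processed_gb - optimized_gb               # ~53.46 GB
savings_pct = 100 * savings_gb / processed_gb          # ~77 %

savings_nc_gb = processed_gb - optimized_nc_gb         # ~0.03 GB, i.e. a few tens of MB
savings_nc_pct = 100 * savings_nc_gb / processed_gb    # ~0 %

print(f"dedup + compression: save {savings_gb:.2f} GB ({savings_pct:.0f}%)")
print(f"dedup only:          save {savings_nc_gb * 1024:.0f} MB ({savings_nc_pct:.0f}%)")
```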

Usually, deduplication gains are expressed as a deduplication ratio. I would recommend using an alternative free tool called Deduplication Analyzer: https://www.starwindsoftware.com/starwind-deduplication-analyzer.

It gives you a more transparent and understandable result, reporting the industry-standard deduplication ratio.
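For context, the deduplication ratio is just the original size divided by the size after reduction. A minimal sketch in Python, applied to the figures from the question (the function name is mine, not something either tool exposes):

```python
def dedup_ratio(original_gb: float, reduced_gb: float) -> float:
    """Deduplication ratio: original size divided by size after reduction."""
    return original_gb / reduced_gb

# Using the figures from the question:
print(f"{dedup_ratio(69.07, 15.61):.2f}:1")   # ~4.42:1 (dedup + compression)
print(f"{dedup_ratio(69.07, 69.04):.4f}:1")   # ~1.0004:1 (dedup only)
```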

(Screenshot: Deduplication Analyzer result screen)

Net Runner

Compression and optimization/deduplication are not the same thing. Compression takes a given file and stores its data differently: parts that repeat within the file are stored once, with pointers and some bookkeeping about how to reassemble the file, instead of being stored multiple times.
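As a rough illustration of that idea (just a sketch with Python's standard zlib module; NTFS compression and Data Deduplication use their own algorithms):

```python
import zlib

# A buffer with a lot of internal repetition compresses well, because the
# repeated runs are stored once plus back-references instead of in full.
data = b"backup block, mostly the same bytes " * 10_000

compressed = zlib.compress(data)
print(len(data), "bytes raw ->", len(compressed), "bytes compressed")
```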

Deduplication/optimization compares multiple files against each other. If you have 10 copies of a specific file, or of a block within a file, you can instead keep one copy with 10 pointers to it. This does not modify the structure of the file itself; it just tells the OS "hey, when you want file xyz.txt from location Z, go find it at location A instead."
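Conceptually, block-level dedup looks something like this toy sketch (hypothetical code, nothing like the actual Windows implementation):

```python
import hashlib

BLOCK_SIZE = 4096

def deduplicate(files: dict[str, bytes]):
    """Keep each unique block once; each file becomes a list of block hashes."""
    store = {}   # block hash -> the single stored copy of that block
    index = {}   # file name  -> ordered list of block hashes ("pointers")
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)   # identical blocks stored only once
            hashes.append(digest)
        index[name] = hashes
    return store, index

# Two files that share most of their content boil down to a handful of blocks.
files = {"a.bak": b"x" * 40_000, "b.bak": b"x" * 40_000 + b"tail"}
store, index = deduplicate(files)
print(len(store), "unique blocks for", sum(len(d) for d in files.values()), "bytes of input")
```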

I've not worked with the tool, so I'm slightly guessing here, but it seems to be saying that if you leave compression off and only deduplicate the data across all instances of it, you'll save about 26.93 MB.

But if you enable compression it can get things down even further - I'm guessing because it can dedup blocks across compressed files.

Keep in mind, though, that compression adds file-access overhead, because the OS has to do work to reconstruct what the contents of the file should look like. That is more effort than deduplication, which also involves reconstruction, but of the "go find this part" kind rather than the "calculate this part" kind.

Mary