
What are the pros and cons of using disk-level encryption like VeraCrypt and then turning on compression on the mounted (virtually decrypted) drive? It seems to me that this could even improve performance if the data compresses very well, since it reduces the amount of data that has to be encrypted. Has anyone benchmarked a configuration like this?

I can see that compressing an already encrypted drive is futile, since encrypted data compresses very poorly and I would only get a performance downgrade, but encrypting compressed files looks interesting.
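A quick way to see both effects is to compare how well typical plaintext and random (ciphertext-like) data compress. This is only an illustrative sketch, using Python's zlib as a stand-in for whatever compressor the filesystem would use; the sample data is made up:

```python
import os
import zlib

# Made-up, highly compressible "plaintext", and random bytes standing in for ciphertext.
text = b"log line: user=alice action=login status=ok " * 20000
ciphertext_like = os.urandom(len(text))

print("plaintext ratio :", len(zlib.compress(text, 6)) / len(text))                        # well below 1.0
print("ciphertext ratio:", len(zlib.compress(ciphertext_like, 6)) / len(ciphertext_like))  # ~1.0, no savings
```

Compressing before encryption shrinks the plaintext so fewer bytes reach the cipher; compressing after encryption achieves essentially nothing.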

watbywbarif
  • one pro: ciphertext has less entropy and hence is harder to attack – SEJPM Apr 21 '15 at 15:20
  • Ciphertext has *less* entropy? – RoraΖ Apr 21 '15 at 15:28
  • More, but conclusion stays the same :) – watbywbarif Apr 21 '15 at 15:33
  • I think he means that, once compressed, the (compressed) clear text has more entropy (more information per data block) which, in turn, should increase the entropy of the resulting ciphertext as well. That being said, this last thing [isn't always true](http://security.stackexchange.com/questions/19911/crime-how-to-beat-the-beast-successor/19914#19914). (A small entropy sketch illustrating this follows these comments.) – Stephane Apr 21 '15 at 15:36
  • Have you run a test? This seems easier to test in practice than to guess in theory. – Neil Smithline Apr 21 '15 at 19:04
  • Another thought. An SSHD really boosts disk I/O performance. Probably much more than compression. Doesn't answer your question but may fix your problem. – Neil Smithline Apr 21 '15 at 19:06
  • @Stephane, that was exactly what I meant and I guess the mentioned attack doesn't apply for disk encryption. I'd highly doubt that one can attack disk encryption via the compression if everything is random-looking (as is the case for VeraCrypt) – SEJPM Apr 21 '15 at 21:03
  • @NeilSmithline I didn't, but will do. I have a 128 GB SSD and compression could give some extra space, but I am worried about performance, so if no one already has some benchmarks I will make them and share here when I catch some time. – watbywbarif Apr 22 '15 at 05:48
  • Very light compression (LZO or LZ4) may be beneficial, depending on the types of files being stored. – forest May 23 '18 at 23:43
  • One test: https://sourceforge.net/p/veracrypt/discussion/general/thread/a02ce303f9/ – Ray Woodcock Apr 28 '22 at 02:53
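As referenced in the comments above, here is a minimal sketch of the entropy argument (purely illustrative; the sample strings are assumptions, and ciphertext from a good disk-encryption scheme is indistinguishable from random data):

```python
import math
import os
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; 8.0 means the data looks completely random."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = b"the quick brown fox jumps over the lazy dog " * 2000
print("plaintext :", round(shannon_entropy(text), 2))                     # low, lots of redundancy
print("compressed:", round(shannon_entropy(zlib.compress(text, 9)), 2))   # close to 8 bits/byte
print("random    :", round(shannon_entropy(os.urandom(len(text))), 2))    # ~8, like good ciphertext
```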

3 Answers


Compression is often a bad idea: it is a performance killer (unless you are using hardware compression), and most binary data compresses poorly; text is what typically saves a lot of disk space. Compressing a volume also exposes it to more risk of data loss, because not only the encryption but also the compression algorithm can leak or lose data.

I have not done any measurements myself recently, but I would suspect it only adds complexity and does not improve performance.
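A rough way to gauge the CPU cost being described, with zlib standing in for the filesystem's compressor (the data and the levels here are assumptions; real implementations such as NTFS compression or LZ4 behave quite differently, so treat the numbers as illustrative only):

```python
import time
import zlib

# Hypothetical, moderately repetitive working-set data (~32 MiB).
data = b"some moderately repetitive working-set data 0123456789 " * 600000

for level in (1, 6, 9):  # fast / default / maximum zlib effort
    start = time.perf_counter()
    out = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(data) / elapsed / 1e6:6.0f} MB/s, ratio {len(out) / len(data):.2f}")
```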

LvB
  • On what do you base the assumption that compression is a performance killer? It all depends on the context. On the contrary: a lot of binary data is highly structured and will therefore compress pretty well. – Stephane Apr 21 '15 at 15:39
  • Compression is a 'search and replace' sort of algorithm: you have to pass through the whole file/block, find all the common elements, rewrite them in a way that takes less space, and store/send the result. This means iterating over the same data set N times, which decreases performance significantly. Compare that with a cryptographic solution, where a block/file is arithmetically altered to produce a ciphertext; that can be heavy as well, but it does not contain any intricate loops (unless your crypto suite does). – LvB Apr 21 '15 at 15:46
  • Most binary data is less structured than human-readable text, for a simple reason: humans have a much narrower set of allowed states to communicate meaning (26×2 letters, 10 digits, about 10 punctuation marks), whereas a computer uses the full range of each byte (256 values). This difference is the root cause of why text compresses well and binary data does not. – LvB Apr 21 '15 at 15:49
  • I have to disagree; by nature, binary data is structured, because programs need to know how data is laid out in order to process it. Beginning of an ELF header: `7F 45 4C 46 02 01 01 00 00 00 00 00 00 00 00 00 03 00 3E 00 01 00 00 00 60 16 00 00 00 00 00 00 40 00 00 00 00 00 00 00 40 64 00 00 00 00 00 00` Pretty sure those zeros will compress. – RoraΖ Apr 21 '15 at 16:31
  • There is a big difference between binary and compressed binary here; being binary does not imply being compressed, although most music, video and images usually are, so one should consider this when choosing which drive/folder to compress. It would be best if one could compress the entire drive while the algorithm skips known compressed formats. – watbywbarif Apr 22 '15 at 06:06
  • @raz Lossless compression means you need to be able to reconstruct the data, and since the headers are only a fraction of the file, the zeros inside them are largely irrelevant. To reconstruct the data you must encode it in a way that takes less space (including the dictionary). Since the variation in binary data is much higher than in text, the compression opportunities are less frequent and the dictionary is often much bigger. Add the fact that many binary files are already compressed in some way, and that compressing an already compressed file gains nothing, and the additional work of compressing the disk is wasted. – LvB Apr 22 '15 at 21:34
  • @LvB For executables, compression can be efficient even without the headers. The [`xz`](https://www.freebsd.org/cgi/man.cgi?query=xz&sektion=1&manpath=FreeBSD+8.3-RELEASE) utility can apply BCJ filters, which makes subsequent LZMA compression more efficient. For example, relative offsets can be converted to more repetitive absolute offsets, improving compression by up to 15%. (A small sketch of this follows these comments.) – forest May 23 '18 at 01:59
  • @forest Compression will always add information to the set, meaning it requires you to add and track more details than without compression. The question was about disk-level compression, where you can end up with huge page files for decompression; it just isn't efficient for a running system. For a backup, yes, totally go for it. – LvB May 23 '18 at 10:10
  • Yeah that's a good point. Applying BCJ filters to an entire disk would be silly. – forest May 23 '18 at 23:32
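To illustrate the BCJ point raised in the comments above, here is a minimal sketch using Python's `lzma` module; it reads the Python interpreter binary as a convenient sample executable, and that choice, along with the preset, is an assumption rather than anything from the thread:

```python
import lzma
import sys

# Use the Python interpreter binary itself as a sample executable.
with open(sys.executable, "rb") as f:
    exe = f.read()

plain = lzma.compress(exe, format=lzma.FORMAT_XZ,
                      filters=[{"id": lzma.FILTER_LZMA2, "preset": 6}])
bcj = lzma.compress(exe, format=lzma.FORMAT_XZ,
                    filters=[{"id": lzma.FILTER_X86},             # BCJ filter for x86 machine code
                             {"id": lzma.FILTER_LZMA2, "preset": 6}])

print("without BCJ:", len(plain), "bytes")
print("with BCJ   :", len(bcj), "bytes")  # usually somewhat smaller for native executables
```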

I'm copying a lot of data to a USB hard drive and noticed that it copies at about 12.4 MB/s unencrypted and only 9.8 MB/s encrypted, roughly a 20% slowdown. This is on an old Core 2 Duo computer, so the loss of performance will likely be significantly smaller with newer, faster processors. The drive was encrypted using standard Windows encryption.
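For anyone who wants to reproduce this kind of measurement, a minimal sketch (the source file and the drive letter of the mounted encrypted volume are placeholders you would substitute):

```python
import os
import time

SRC = "testfile.bin"        # hypothetical large test file on a fast local disk
DST = r"E:\copy_test.bin"   # hypothetical path on the encrypted USB drive

size = os.path.getsize(SRC)
start = time.perf_counter()
with open(SRC, "rb") as fin, open(DST, "wb") as fout:
    while chunk := fin.read(4 * 1024 * 1024):  # copy in 4 MiB chunks
        fout.write(chunk)
    fout.flush()
    os.fsync(fout.fileno())                    # make sure the data actually hit the disk
elapsed = time.perf_counter() - start
print(f"{size / elapsed / 1e6:.1f} MB/s")
```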

Ken P

The advantage of compression comes in when you have something like a backup volume (for example, for your documents), a volume where you do not alter the data (edit the files / increase their size). If your data is working data that changes very often, compression is not recommended. Also, if you have videos or other very large files of that sort, it is not useful to compress at all, because the difference in space would be insignificant.
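One way to act on that advice is to skip file types that are already compressed and only compress what actually shrinks. A rough, illustrative heuristic (the extension list and the 10% threshold are assumptions, not something this answer prescribes):

```python
import os
import zlib

# Formats that are typically already compressed and will not shrink further.
ALREADY_COMPRESSED = {".mp4", ".mkv", ".jpg", ".png", ".zip", ".7z", ".gz", ".mp3"}

def worth_compressing(path: str, sample_size: int = 1 << 20) -> bool:
    """Skip known-compressed formats, otherwise test-compress a 1 MiB sample."""
    if os.path.splitext(path)[1].lower() in ALREADY_COMPRESSED:
        return False
    with open(path, "rb") as f:
        sample = f.read(sample_size)
    if not sample:
        return False
    ratio = len(zlib.compress(sample, 1)) / len(sample)
    return ratio < 0.9  # only bother if we would save at least ~10%
```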

Overmind