12

Consider symmetric GPG encryption of a given file my_file.txt. Something like this (on the command line):

gpg --symmetric --cipher-algo AES256 my_file.txt

After supplying the password at the prompt, the above produces my_file.txt.gpg. I could then encrypt again:

gpg --symmetric --cipher-algo AES256 my_file.txt.gpg

(where you would want to set a different password)

And so on. Is there a limit on how many iterations of the above I can do? It seems to me there isn't, as symmetric encryption just takes a piece of text and transforms it into another, without ever asking what the piece of text is in the first place. Is this true?
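For concreteness, here is the kind of iteration I have in mind, as a sketch (each gpg run prompts for that layer's passphrase; the filenames just follow gpg's default .gpg suffixing):

f=my_file.txt
for i in 1 2 3; do
    gpg --symmetric --cipher-algo AES256 "$f"   # writes "$f.gpg"
    f="$f.gpg"
done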

luchonacho
  • Assuming that it uses a mode that requires an IV: eventually you will run out of space, since each iteration produces a larger output. The same consideration applies if padding is used. – kelalaka Feb 18 '19 at 10:03
  • No, there isn't, not until you run out of hard drive space from the encryption size overhead, anyway. – user1067003 Feb 18 '19 at 12:05
  • However, be warned that, all other things being equal, encrypting a file twice with the same algorithm (and different passwords) will not significantly improve its security: if someone can break the first layer of encryption, they can most probably break the second layer with the same amount of effort. – A. Hersean Feb 18 '19 at 14:53
  • @A.Hersean presumably it still protects against an attacker who has managed to capture one of the passwords? – Chris H Feb 18 '19 at 16:18
  • @ChrisH Yes, it is useful if the attacker brute-forced or intercepted one of the passwords, but not if he found a vulnerability in the algorithm used to encrypt the data. – appa yip yip Feb 18 '19 at 17:15
  • @ChrisH if they "found" one of the passwords, it's highly likely they can get the second one. After all, entering password one just nets you another encrypted, still unusable file. If an attacker can intercept password one, they very likely have the means to intercept password two. Double encryption helps if you want to protect against brute forcing, I guess, where the attacker has to do double the work. – VLAZ Feb 18 '19 at 17:24
  • @ChrisH There is another situation - if you have an encrypted (once) file pass through different systems and each encrypts it (a second time) while within its domain. So, you want to protect against only a system in the chain having its password stolen, where the attacker doesn't have the end destination password with which the file is finally decrypted to plaintext. Still, if you *expect* a compromised system in a chain of such, you should probably just never encrypt/decrypt before the final destination. And be better at securing the systems. – VLAZ Feb 18 '19 at 17:25
  • @VLAZ that depends on whether the passwords are ever kept together, or even in the hands of the same individual. – Chris H Feb 18 '19 at 17:41
  • @user1067003 It's perfectly feasible to have no space overhead for encryption. Of course, most encryption utilities include things like authentication tags, headers, metadata, salts, etc. that add to overhead. – forest Feb 19 '19 at 01:52
  • @A.Hersean If there's no way for an attacker to verify that the first key is correct without also correctly guessing the second key, then an n-bit cipher used twice with two keys gives you the equivalent of a 2n-bit cipher. If an attacker can verify that the first key is correct (e.g. with a header or magic bytes, as is done with GnuPG), then using the same cipher twice with two keys gives you only the equivalent of an (n+1)-bit cipher. – forest Feb 19 '19 at 01:56
  • @VLAZ Notice [this question](https://security.stackexchange.com/q/203830/185302), related to your comment. (will delete this comment soon) – luchonacho Feb 19 '19 at 15:43
  • @ChrisH Notice [this question](https://security.stackexchange.com/q/203830/185302), related to your comment. (will delete this comment soon) – luchonacho Feb 19 '19 at 15:45
  • @forest I completely agree with you. However, the first case you mention is rare, as most encryption schemes use a MAC derived from the encryption key (or another way to verify the password), so I overlooked it. – A. Hersean Feb 19 '19 at 18:17

4 Answers

24

Theoretically, there's no limit on the number of times you can encrypt a file. The output of an encryption process is again a file, which you can pass to a different algorithm to get yet another output.

The thing is, on the decryption side, the layers will have to be removed in LIFO (last in, first out) order, with the proper passwords.

For example, if your file was first encrypted with algo1 with password abcde, and then it was encrypted with algo2 with password vwxyz, then it will have to be decrypted first with algo2 (with password vwxyz), and then with algo1 (with password abcde).
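In GPG terms, a minimal sketch of the round trip (filenames are assumptions for illustration; here both layers use the question's AES256 command, and gpg prompts for each passphrase):

gpg --symmetric --cipher-algo AES256 my_file.txt        # layer 1: password abcde
gpg --symmetric --cipher-algo AES256 my_file.txt.gpg    # layer 2: password vwxyz

gpg --output inner.gpg --decrypt my_file.txt.gpg.gpg    # peel layer 2 first (vwxyz)
gpg --output recovered.txt --decrypt inner.gpg          # then layer 1 (abcde)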

This method makes sense if you're sending the keys through different media or channels; it would be of little use if all the passwords are sent through the same channel.

pri
  • Real-world example: You are downloading a password-protected ZIP archive containing DRM-protected media files from an HTTPS URL through an IPsec-encrypted network using Tor Browser. – Philipp Feb 18 '19 at 16:55
  • LIFO **may** not be necessary. I'm sure you can design algorithms 1 and 2 where the order of encryptions does not change the result (so `encryptA(encryptB(m)) == encryptB(encryptA(m))`). Now I don't think that this *necessarily* would decrease the encryption strength. AFAIK current algorithms do not support this. – Giacomo Alzetta Feb 18 '19 at 17:55
  • @GiacomoAlzetta most stream ciphers actually work this way (if you factor out things like initialization vectors, padding and MACs), because they are just XORing the keystream with the plain text. – Paŭlo Ebermann Feb 18 '19 at 22:33
  • Combining encryption algorithms may also not improve security; double DES [exemplifies this](https://security.stackexchange.com/a/57061/46979). I imagine it's also possible to actually make the encryption weaker or introduce new attack vectors. – jpmc26 Feb 18 '19 at 23:48
  • @GiacomoAlzetta It does decrease encryption strength to less than the sum of the two algorithms through a meet in the middle attack, but not significantly enough to be worried about it. It does, however, come with the nasty side-effect that encryption and decryption are the same operations, so encrypting twice with the same password results in no encryption at all. This is how most stream ciphers operate. – forest Feb 19 '19 at 01:49
  • @jpmc26 It wouldn't be possible to make the encryption _weaker_ unless you are covertly encoding some aspect of the plaintext in the ciphertext size with the first cipher so the information "leaks" through the second, secure cipher in a way that wouldn't happen if the second cipher was used on its own. – forest Feb 19 '19 at 01:50
  • @forest Not that I don't believe you, but is that mathematically proven or is it just that we don't know of any way it could happen? – jpmc26 Feb 19 '19 at 02:23
  • @jpmc26 Assuming there is no change in size or other underhanded techniques to leak information, and assuming independent keys, it is trivially proven that a series of cascaded ciphers is as secure as the strongest cipher. This is most obvious with stream ciphers. This doesn't necessarily imply that adding different block modes into the mix doesn't violate the proof, though, e.g. with a padding oracle attack as mentioned in https://crypto.stackexchange.com/a/33817/54184. – forest Feb 19 '19 at 02:26
  • Finally found a proof of IND-CPA security for cascaded ciphers: https://eprint.iacr.org/2002/135.pdf – forest Feb 19 '19 at 02:46
  • @Philipp ...that you're accessing remotely via SSH. Over WPA2-protected Wi-Fi. And a VPN. And the hard drive is encrypted. And the media files are in an obscure language that you have to translate. – JesseTG Feb 19 '19 at 04:43
  • I agree that LIFO might not be necessary every time, but then applying LIFO (with correct passwords) will get the correct results every time. – pri Feb 20 '19 at 05:21
8

GnuPG uses the CFB mode of operation for symmetric encryption (as defined in RFC 4880). For AES, CFB mode requires a 128-bit IV, and it needs no padding.

While theoretically there is no limit, as pointed out in the other answer, there is a practical limit due to the file size increase. For example, I encrypted a file of 163 bytes and the result was 213 bytes; re-encrypting that result gave 295 bytes, then 382 bytes, 473 bytes, ...

These sizes also include GnuPG's packet overhead. So, sooner or later you will run out of space.
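You can reproduce the growth yourself. A rough sketch (the --batch/--passphrase combination is for non-interactive use; depending on your GnuPG version you may also need --pinentry-mode loopback):

printf '%0.s.' {1..163} > f0    # a 163-byte test file (bash)
gpg --batch --passphrase p1 --symmetric --cipher-algo AES256 --output f1 f0
gpg --batch --passphrase p2 --symmetric --cipher-algo AES256 --output f2 f1
wc -c f0 f1 f2                  # each layer adds IV plus packet overhead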

kelalaka
  • I was going to answer "about 2^64 times", for similar concerns. – Joshua Feb 19 '19 at 03:36
  • @Joshua I initially wrote a comment since I didn't have time. When I came back and saw that nobody had answered, I wrote one. Long before 2^64 iterations, things will slow down, since the file will become hard to manage. – kelalaka Feb 19 '19 at 07:13
7

It's correct that there's no limit on the number of times you can encrypt a file, but it's not necessarily the case that you must decrypt in LIFO order.

You can always be sure that LIFO decryption will work, but certain multiply encrypted files can be decrypted out of order without affecting the result (depending on which algorithms were used for encryption):

Consider encrypting the same file twice using a one-time pad (XOR) with different keys. You can decrypt in either order, because (A xor B) xor C == (A xor C) xor B for every bit.
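A toy single-byte check of that identity in bash (a real one-time pad applies it bitwise across the whole message):

A=0x5C; B=0x3E; C=0xA1          # plaintext byte A, pad bytes B and C
echo $(( (A ^ B) ^ C ))         # apply pad B, then pad C ...
echo $(( (A ^ C) ^ B ))         # ... or C, then B: the same byte either way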

(This would be a comment if I had 50 rep, feel free to edit the other answer and delete this one.)

EDIT: See this question for more details on this edge case.

  • The thing is, XOR (or `add mod n`) with a one-time pad is pretty much the *only* algorithm that can be decrypted in an arbitrary order like this. – Martin Bonner supports Monica Feb 18 '19 at 14:21
  • @MartinBonner What about the algorithms used for [Diffie-Hellman key exchange](https://en.wikipedia.org/wiki/Diffie–Hellman_key_exchange)? – Solomon Ucko Feb 18 '19 at 14:27
  • @SolomonUcko Key Exchange is not encryption. What did you have in mind? – Martin Bonner supports Monica Feb 18 '19 at 17:13
  • @MartinBonner Good point, nevermind. I wasn't paying enough attention. – Solomon Ucko Feb 18 '19 at 17:17
  • Arbitrary encryption order works for "plain encryption" with any XOR-based stream cipher. However, most practical cryptosystems add additional data (IV, integrity checks, etc.) to the ciphertext, which would break out-of-order decryption. – Peter Green Feb 18 '19 at 20:18
  • This "answer" looks to me like it is actually a comment regarding one of the other answers, not an answer to the question... – hft Feb 18 '19 at 22:22
  • The simple XOR cipher has the property that data encrypted twice with keys A and B can be decrypted using *one* key (having the value A xor B). This makes multiple-XOR encryption rather pointless, as it can never be stronger than a single XOR cipher. – dan04 Feb 19 '19 at 00:16
  • @dan04 "Which makes multiple-XOR encryption rather pointless": I don't think that's the case. As I understand it, a one-time pad basically shifts the problem from "How do I share this data securely?" to "How do I share this key securely?". If you share A and B in different ways, an attacker needs to be able to intercept _both_ of those communications to decrypt the file (even partially). Using a single key (even sending half over one channel, and half over the other) would still allow an interceptor to determine half of the plaintext if they intercept it. – Steven Jackson Feb 19 '19 at 13:23
  • @hft I believe I addressed that in my final (non-edit) sentence. I was restricted by my current reputation, and sought to improve OP's understanding of the topic as much as I could given my constraints. I now have enough rep to comment, but am not sure if deleting an answer that's generated (IMO) relevant discussion is appropriate at this point. – Steven Jackson Feb 19 '19 at 13:27
  • The question is specifically about symmetric GPG encryption though. Does this work there? – pipe Feb 19 '19 at 15:19
2

As many have already observed, you will see a small increase in the size of your file after each level of encryption, due to the IV that is added at each step. But that is not really the relevant issue.

Rather, I'd like to observe that your motivation for doing this is presumably to increase the robustness of your ciphertext against attacks, including brute-force ones. If you use a key of $n$ bits for each of $h$ levels of encryption, say keys $k_1, k_2, \ldots, k_h$, you might expect to get the robustness of a single encryption under one longer key of $h \times n$ bits. Theoretically speaking, however, an adversary can run the meet-in-the-middle attack, which reduces the size of the effective key space to far less than $2^{h \times n}$.

A practical example is 2-DES, where the plaintext $P$ is first encrypted under one DES key (56 bits), obtaining an intermediate ciphertext $C'$, and then $C'$ is encrypted again under another DES key, obtaining the final ciphertext $C$. The expectation of a key space of size $2^{112}$ is wrong: the actual attack cost is about $2^{57}$ operations. The meet-in-the-middle attack is a known-plaintext attack (the adversary knows a pair $(P, C)$). The adversary first builds a table of all $2^{56}$ possible encryptions of $P$ (one for each potential first key), then generates all possible decryptions of $C$ (one for each potential second key) and, for each such decryption $C''$, checks whether $C''$ matches an entry in the table. A match means $C'' = C'$, and both DES keys have been discovered. The total number of encryptions/decryptions is $2^{56} \times 2 = 2^{57}$.
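To make the shape of the attack concrete, here is a toy sketch in bash against two layers of a made-up 8-bit cipher (the enc/dec functions are invented purely for illustration; they are not DES). It enumerates all candidate key pairs with roughly $2 \times 2^8$ cipher calls instead of the $2^{16}$ a naive brute force would need:

enc() { echo $(( (($1 ^ $2) + $2) & 255 )); }   # toy cipher, invertible but insecure
dec() { echo $(( ((($1 - $2) & 255)) ^ $2 )); } # its inverse: dec(enc(p,k),k) == p

P=42; K1=23; K2=99                    # the secret two-layer key
C=$(enc "$(enc $P $K1)" $K2)          # double encryption of the known plaintext

declare -A table                      # middle values reachable forward from P
for k in {0..255}; do table[$(enc $P $k)]+="$k "; done

count=0                               # now meet in the middle from C's side
for k2 in {0..255}; do
    m=$(dec $C $k2)
    for k1 in ${table[$m]}; do
        count=$((count + 1))
        [ "$k1" -eq "$K1" ] && [ "$k2" -eq "$K2" ] && echo "true pair: k1=$k1 k2=$k2"
    done
done
echo "$count candidate pairs; a second known (P, C) pair would isolate the true one"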

For similar reasons, 3-DES (three levels of DES encryption using three different keys) offers the security of only a 112-bit key.

flex