199

RSA commonly uses keys of sizes 1024-bit, 2048-bit, or even 3072-bit, while most symmetric algorithms use only between 112-bit and 256-bit keys. I do realize that current keys are secure enough for today's hardware, but as computers get faster, should we not consider an insanely large key size, like a million bits or so, to protect ourselves against supercomputer systems that have not been invented yet?

So in other words: what are the consequences of choosing a cipher key that is too large, and why does everyone restrict their key sizes?

Koning
  • 1,633
  • 3
  • 11
  • 5
  • See page 32 of this [document](http://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/algorithms-key-sizes-and-parameters-report/at_download/fullReport "Algorithms, Key Sizes and Parameters Report") from [enisa](http://www.enisa.europa.eu "European Union Agency for Network and Information Security") on the value of longer keys in the case of RSA. – user2284570 Dec 24 '13 at 23:07

8 Answers

307

I dug out my copy of Applied Cryptography to answer this. Concerning symmetric crypto, 256 bits is plenty and probably will be for a long, long time. Schneier explains:

Longer key lengths are better, but only up to a point. AES will have 128-bit, 192-bit, and 256-bit key lengths. This is far longer than needed for the foreseeable future. In fact, we cannot even imagine a world where 256-bit brute force searches are possible. It requires some fundamental breakthroughs in physics and our understanding of the universe.

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38 × 10^−16 erg/K, and that the ambient temperature of the universe is 3.2 Kelvin, an ideal computer running at 3.2 K would consume 4.4 × 10^−16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our sun is about 1.21 × 10^41 ergs. This is enough to power about 2.7 × 10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.

But that's just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. **And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.**

The boldness is my own addition.
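
As a quick sanity check of the quoted arithmetic, here is a small sketch (my own, not Schneier's; it simply replays the constants from the quote, in ergs and kelvin):

```python
# Replaying the thermodynamic back-of-envelope numbers from the quote above.
import math

k = 1.38e-16                      # Boltzmann constant, erg/K
T = 3.2                           # cosmic background temperature, K
energy_per_bit = k * T            # ~4.4e-16 erg per bit flip

sun_year = 1.21e41                # annual solar output, erg
flips = sun_year / energy_per_bit
print(f"{flips:.2e}")             # ~2.7e+56 single-bit changes
print(math.log2(flips))           # ~187 -> a 187-bit counter
print(math.log2(flips * 32))      # ~192 -> count to 2^192 in 32 years

supernova = 1e51                  # erg
print(math.log2(supernova / energy_per_bit))  # ~220 -> the ~219-bit counter quoted
```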

Remark: Note that this example assumes there is a 'perfect' encryption algorithm. If you can exploit weaknesses in the algorithm, the key space might shrink and you'd end up with effectively fewer bits in your key.

It also assumes that the key generation is perfect, yielding 1 bit of entropy per bit of key. This is often difficult to achieve in a computational setting. An imperfect generation mechanism might yield only 170 bits of entropy for a 256-bit key. In this case, if the key generation mechanism is known, the size of the brute-force space is reduced to 170 bits.

Assuming quantum computers are feasible, however, any RSA key will be broken using Shor's algorithm. (See https://security.stackexchange.com/a/37638/18064)

lynks
  • 10,636
  • 5
  • 29
  • 54
  • 7
    What about quantum computing? – bseibold Dec 17 '12 at 21:38
  • 5
    Isn't that just the energy limit for irreversible computation? If you compute with reversible gates, you can theoretically use much less energy. – tzs Dec 17 '12 at 21:49
  • 8
    @tzs the physics involved represent the bare minimum energy requirement just to represent information in the most efficient sense that quantum mechanics will allow, well beyond even theoretical technology, not even counting the energy for computing, gates, and all that. – tylerl Dec 18 '12 at 02:58
  • Chances are you wouldn't need to cycle through all the possibilities. – OpenCoderX Dec 18 '12 at 21:46
  • 6
    @opensourcechris correct, you would have to cycle through exactly half of them on average, so 2^255 keys need to be checked, which is still around 10^12 times more than the energy released in a supernova. – lynks Dec 18 '12 at 22:31
  • 2
    @tylerl That minimum energy requirement is per irreversible operation. Since quantum computers do more with a single operation, you need to double the number of bits. Then there is the issue of reversible computing. If you build a computer where each operation is reversible, you can go beyond those limits. But it's completely unclear if reversible computers can be built and if they can be applied to cryptography problems like this. – CodesInChaos Sep 09 '13 at 09:06
  • 5
    OK everybody, no more voting on this. A net +256 vote is perfect! – user Aug 12 '14 at 09:22
  • @MichaelKjörling - I'm not willing to downvote for something like getting back down to 256, so I guess it's lost cause... unless.... Tell you what: I'll wait and if I ever notice it at 257, I'll toss a downvote his way - hope that's welcome :-P – Code Jockey Oct 05 '15 at 19:50
  • @tylerl: tzs and CodesInChaos point to the fact that Schneier's analysis appears to be based on Landauer’s principle which states, in effect, that there is a minimum bound on the amount of energy required to erase one bit of information. It indeed pertains only to irreversible bit manipulation. The “Landauer limit” has been the focus of considerable research. While Schneier is brilliant, no one has a crystal ball. Recent results by Miquel López-Suárez et al. suggest that indeed there is no lower limit to the energy required for computation: http://www.nature.com/articles/ncomms12068 – Harlan Mar 07 '17 at 18:22
  • @tylerl The López-Suárez result (above) has motivated research on “zero-power computing” and implies that physics-based arguments for the absolute security of 128-bit keys (i.e., quantum-reduced 256-bit keys) might need to be revisited. See: http://iopscience.iop.org/article/10.1088/0957-4484/26/22/222001/meta. Whether zero-power computing will ever come to pass is anyone's guess, but the thermodynamics argument no longer seems airtight. – Harlan Mar 07 '17 at 18:23
  • 1
    Even AES-256 may not be safe for long. See these two articles: [TEMPEST Attacks Against AES](https://www.fox-it.com/en/insights/blogs/blog/tempest-attacks-aes/) and [Experts Recover AES256 Encryption Key From a PC's Electromagnetic Emissions](https://www.bleepingcomputer.com/news/security/experts-recover-aes256-encryption-key-from-a-pcs-electromagnetic-emissions/). – Chiramisu Sep 08 '17 at 18:28
  • Incidentally, this seems to me an excellent proof of the existence of a Creator, since even the most “simple” organisms from which abiogenesis proposes we all evolved comprise much more than 256 bits of information. – Aryeh Leib Taurog Jan 20 '19 at 10:46
  • Why would you need a heat pump during computation? Could one build a room, cool it to 0 K (here you need the heat pump), and then fully isolate it before computing (now you no longer need it)? Assuming the computer perfectly uses the energy you give it (so it doesn't raise the room temperature), would that mean kT = 0 and that this physical limit disappears? – Xenos Apr 23 '20 at 13:06
79

The reason why RSA keys are so small is that:

With every doubling of the RSA key length, decryption is 6-7 times slower.

So this is just another of the security-convenience tradeoffs. Here's a graph:

*(Figure: RSA decryption time by key length)*

Source: http://www.javamex.com/tutorials/cryptography/rsa_key_length.shtml
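
For a rough feel of this on your own machine, here is an illustrative timing sketch (my own, assuming the third-party pyca/cryptography package is installed; the exact slowdown factors vary by implementation and hardware):

```python
# Timing RSA private-key (decryption) operations as the key size grows.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

for bits in (1024, 2048, 4096):
    key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    ct = key.public_key().encrypt(b"session key", oaep)
    start = time.perf_counter()
    for _ in range(100):
        key.decrypt(ct, oaep)
    ms = (time.perf_counter() - start) / 100 * 1000
    print(f"{bits:5d}-bit RSA: {ms:.2f} ms per decryption")
```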

Fredefl
  • 1,011
  • 8
  • 4
  • 2
    +1. Big key lengths for "offline" asymmetric crypto (like PGP) are often used, but for "online" key exchanges, a 2048-bit key for 30-year security is sufficient for most applications, and doesn't annoy the user with a 2-minute wait during the SSL handshake. – SecurityMatt Dec 13 '12 at 13:55
  • 16
    Keep in mind that asymmetric cyphers are usually used only to protect symmetric session keys, so this increase in asymmetric cypher decryption time is not *that* dramatic in practice. – Alexander Shcheblikin Dec 13 '12 at 14:28
  • 1
    But when you bear in mind that browsers are already resorting to dirty tricks like DNS prefetching and just-in-time compiling JavaScript, you'll realize that the cost of going from 4096-bit keys to 65536-bit keys matters. And also, since the best known attacks on 2048-bit RSA are more work than brute-forcing the AES-256 key, there's no cryptographic benefit to doing it either. – SecurityMatt Dec 13 '12 at 18:23
  • 3
    @SecurityMatt Any source for that claim? The claims I heard is that breaking 2048 bit RSA is about as hard as breaking a 112 bit symmetric algo, not harder than breaking 256 bit encryption. – CodesInChaos Jan 26 '14 at 14:03
  • @SecurityMatt *yet...* – Craig Tullis Oct 17 '15 at 04:21
  • Do you have similar graphs for ECC and AES? – Dan W Nov 25 '15 at 04:36
66

For one, AES is built for three key sizes: 128, 192, or 256 bits.

Currently, brute-forcing 128 bits is not even close to feasible. Hypothetically, if an AES key had 129 bits, it would take twice as long to brute-force as a 128-bit key. This means larger keys of 192 bits and 256 bits would take much, much longer to attack. It would take so incredibly long to brute-force one of these keys that the sun would stop burning before the key was found.

2^256=115792089237316195423570985008687907853269984665640564039457584007913129639936

That's a big freaking number. That's how many possible keys there are. Assuming the key is random, if you divide that by two, you get how many keys it will take, on average, to brute-force AES-256 (a quick sketch of this arithmetic follows).
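
Here is that arithmetic as a sketch (my own numbers; the 10^18 guesses-per-second rate is a deliberately absurd assumption to make the point):

```python
# Keyspace size and expected brute-force effort for a random 256-bit key.
keyspace = 2 ** 256
average_guesses = keyspace // 2            # 2^255 tries on average

# Even granting an absurd 10^18 guesses per second:
seconds = average_guesses // 10 ** 18
years = seconds // (365 * 24 * 3600)
print(f"{years:.2e} years")                # on the order of 10^51 years
```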

In a sense we do have the really big cipher keys you are talking about. The whole point of a symmetric key is to make it infeasible to brute-force. In the future, if attacking a 256-bit key becomes possible, then key sizes will surely increase, but that is quite a ways down the road.

The reason RSA keys are much larger than AES keys is that they are two completely different types of encryption. This means a person would not attack an RSA key the same way they would attack an AES key.

Attacking symmetric keys is conceptually easy (see the toy sketch after this list):

  1. Start with a bitstring 000...
  2. Decrypt the ciphertext with that bitstring.
  3. If you can read the result, you succeeded.
  4. If you cannot read it, increment the bitstring and repeat from step 2.
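
Here is a toy version of that loop (my own illustration, NOT real cryptography): a made-up XOR "cipher" with a deliberately tiny 16-bit key, so the search finishes instantly. The identical loop over a 256-bit keyspace would need about 2^255 iterations on average, which is the whole point of large symmetric keys.

```python
# Exhaustive key search against a toy 16-bit XOR "cipher".

def toy_cipher(data: bytes, key: int) -> bytes:
    kb = key.to_bytes(2, "big")                     # 16-bit key
    return bytes(b ^ kb[i % 2] for i, b in enumerate(data))

def brute_force(ciphertext: bytes, known_plaintext: bytes) -> int:
    for candidate in range(2 ** 16):                # step 1: start at 000...0
        # step 2: decrypt; step 3: check readability (here: known plaintext)
        if toy_cipher(ciphertext, candidate) == known_plaintext:
            return candidate
    raise ValueError("key not found")               # step 4 loops via range()

secret_key = 0xBEEF
ct = toy_cipher(b"attack at dawn", secret_key)
print(hex(brute_force(ct, b"attack at dawn")))      # 0xbeef
```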

Attacking an RSA key is different, because RSA encryption/decryption works with big semiprime numbers; the process is mathematical. With RSA, you don't have to try every possible bit string: the best known attacks (such as factoring the modulus) try far fewer than 2^1024 or 2^2048 candidates, but they are still not feasible in practice. This is why RSA and AES keys differ in size.[1]
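
To make the contrast concrete, here is a hypothetical toy of that "mathy" attack, recovering a private key by factoring a tiny textbook modulus (n = 3233 is my illustrative choice; real 2048-bit moduli are far beyond trial division):

```python
# Recovering a (tiny!) RSA private key by factoring the public modulus.
from math import isqrt

def factor(n: int) -> tuple[int, int]:
    for p in range(3, isqrt(n) + 1, 2):   # n is odd for RSA
        if n % p == 0:
            return p, n // p
    raise ValueError("no factor found")

n, e = 3233, 17                  # toy public key (p=53, q=61)
p, q = factor(n)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # modular inverse (Python 3.8+)
print(d)                         # 2753 -- the private exponent, recovered
```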

To sum up everything and answer your question in one sentence: we don't need ridiculously big symmetric keys because we already have ridiculously big symmetric keys. 256-bit encryption sounds wimpy compared to something like a 2048-bit RSA key, but the algorithms are different and can't really be compared 'bit to bit' like that. In the future, if there is a need for longer keys, new algorithms will be developed to handle them. And if we ever wanted to go bigger on current hardware, it's simply a time tradeoff: a bigger key means longer decryption time, which means slower communication. This is especially important for a cipher, since your internet browser will establish and then use a symmetric key to send information.

Rell3oT

8

Processing time, pure and simple. Everything in security is a balancing act between the need for security (keeping the bad people out) and usability (letting the good people in). Encryption is a computationally expensive operation, even with dedicated hardware for doing the calculations.

It simply isn't worth going beyond a certain level of security for most purposes, because the tradeoffs make the system exponentially harder to use while offering almost no tangible benefit (since the difference between a billion years and a hundred billion years isn't that significant in practical terms).

Also, as for RSA vs. AES, that's the nature of symmetric versus asymmetric cryptography. Put simply, with symmetric cryptography (where there is one shared key), there is nothing to start guessing from, so it is very hard. With asymmetric cryptography such as RSA, you are disclosing a piece of information (the public key) that is related to the decryption key (the private key). While the relationship is "very hard"™ to calculate, it is far weaker than having no information to work from. Because of this, larger key sizes are necessary to make the problem of deriving the private key from the public key harder, while trying to limit how much more expensive the math becomes for encryption and decryption.

AJ Henderson
  • 41,816
  • 5
  • 63
  • 110
7

In a way, algorithms using such "insanely large" keys already exist: they're called one-time pads. Nobody really uses them in practice, though, since they require a key the length of the message you wish to encrypt, and key material can never be reused (unless you want the ciphertext to become trivially breakable). Given that the purpose of encryption is to convert a large secret (the plaintext) into a small secret (the key), OTPs are only useful in very specific and highly specialized scenarios. You might as well transmit the plaintext securely, because you will need just as secure a transmission channel for the key material.
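
For the curious, a minimal sketch of the idea (my own illustration; `os.urandom` stands in for the truly random source a real OTP would require):

```python
# One-time pad: the pad must be truly random, at least as long as the
# message, and never reused. XOR with the pad encrypts; XOR again decrypts.
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"meet at the safe house"
pad = os.urandom(len(message))      # key material as long as the message
ciphertext = otp_xor(message, pad)
print(otp_xor(ciphertext, pad))     # b'meet at the safe house'
```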

In all fairness, OTPs do have one specific use case. That is, when you need provable security in an environment where you have access to a secure communications channel at one point but need to transmit a message securely at some later time.

The OTP is provably secure because, properly used and with properly generated key material, any decrypted plaintext is equally likely, and no part of the key material gives (or is supposed to give) you any insight into other parts of the key material or how to decrypt other parts of the message. That's easy in theory, but awfully difficult to pull off in practice. You are looking at short, high-grade military or possibly diplomatic secrets, at most.

For most people, 128- to 256-bit symmetric keys, or 2048- to 4096-bit asymmetric keys (assuming something like RSA), are plenty, for reasons already described by Rell3oT, Alexander Shcheblikin, lynks, and others. Anyone wanting to attack a 256-bit-equivalent key is going to attack the cryptosystem, not the cryptography, anyway. (Obligatory XKCD link.) PRNG attacks have broken otherwise properly implemented and theoretically secure cryptosystems before, and one would be a fool to think that has happened for the last time.

user
  • 7,670
  • 2
  • 30
  • 54
5

Adding more evidence to the "because it slows things down unnecessarily" answers: AES execution time doesn't grow as fast as RSA's when the key length goes up (and RC6's grows even more slowly), but doubling the key length still increases execution time by about 16%, according to http://www.ibimapublishing.com/journals/CIBIMA/volume8/v8n8.html .

Cyrus
  • 51
  • 2
  • AES-128 is 10 rounds, AES-192 is 12 rounds, and AES-256 is 14 rounds. Because AES' key schedule is trivial, execution time scales almost exactly linearly with the number of rounds and amount of data to be processed. Hence, going from AES-128 to AES-256 should increase the time per block by almost exactly 40% (i.e., reduce throughput to roughly 10/14 ≈ 71%). – user Feb 20 '16 at 13:21
  • With RSA, the algorithm works the same regardless of the key size. This is a common benefit of algorithms that work in a group: just make the field larger. For symmetric cryptography, most algorithms contain steps which cannot be scaled to different key sizes without significantly altering the algorithm. For instance, the AES minimum key size is 128 bits and the maximum is 256 bits. An enhanced AES algorithm that accepted 512-bit keys would likely be substantially different from AES (or Rijndael). – user4982 Mar 03 '18 at 20:04
5

Processing time was already mentioned. Even in that respect, the time required to generate an RSA key deserves separate mention: it is MUCH more costly for longer keys, because you need to find prime numbers of roughly half the size of the desired RSA key.
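
To see this for yourself, a rough timing sketch, assuming the third-party pyca/cryptography package is installed (absolute times vary by machine; the steep growth is the point):

```python
# RSA key generation means finding two random primes of ~key_size/2 bits,
# which gets dramatically slower as keys grow.
import time
from cryptography.hazmat.primitives.asymmetric import rsa

for bits in (1024, 2048, 4096):
    start = time.perf_counter()
    rsa.generate_private_key(public_exponent=65537, key_size=bits)
    print(f"{bits}-bit keygen: {time.perf_counter() - start:.2f} s")
```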

Another topic is space, i.e. the amount of data generated. Asymmetric and symmetric ciphers operate on blocks. Even a signature over a single byte needs as many bits as the RSA key, i.e. 256 bytes for a 2048-bit key. The same applies to the block sizes of the symmetric algorithms, which also grow with the key length. While this also seems like no argument at first sight, some severely restricted devices such as smart cards (still the most secure container for the private key) are supposed to handle this, and there are many applications which need to store a lot of signatures, certificates, cryptograms, etc. Actually, this is one of the reasons for elliptic curve cryptography, since it crams more security into fewer bits.
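
For illustration, a small sketch (again assuming the pyca/cryptography package) showing that an RSA signature is as long as the modulus, no matter how short the signed message is:

```python
# An RSA signature occupies one full modulus-sized block.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = key.sign(b"x", padding.PKCS1v15(), hashes.SHA256())
print(len(signature))   # 256 bytes (2048 bits), even for a 1-byte message
```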

guidot
  • 161
  • 3
  • *"block sizes of the symmetric algorithms, which als grows with the key length"* That is not necessary. DES: 64 bit key (minus eight bits ignored), 64 bit block. IDEA: 128 bit key, 64 bit block. AES: 128, 192, 256 bit key, fixed 128 bit block. Blowfish: up to 448 bit key, fixed 64 bit block. RC5: up to 2040 bit *symmetric* key, 32/64/128 bit block (selectable, 64 bit block suggested). And so on. – user Aug 12 '14 at 09:50
  • @MichaelKjörling: I actually can't see the benefit of a block size argument if the whole algorithm family is changed. Within one algorithm family, larger key sizes mean larger block sizes, though this is not a strict increase either. The original question addressed key sizes and did not suggest changing algorithms too, so what is your point? – guidot Aug 12 '14 at 10:25
  • OK, well, even within the exact same algorithm (and out of the examples already cited), we've still got AES which can use several different key sizes but always the same block size. Or Blowfish, which has variable key size and fixed block size. – user Aug 12 '14 at 11:11
2

The OP asked: "So in other words: what are the consequences of choosing a cipher key that is too large...?" A 256-bit key is plenty strong, as the other answers here show; however, a very secure key (which is a good thing) will simply cause malicious persons to look for a weakness elsewhere in the system.

brandtrock
  • 21
  • 2