
This question is based on the assumption that any data, once encrypted, may (eventually) be decrypted through:

  1. Brute force (compute power/time)
  2. Exploits in the cryptography used
  3. Theft of private keys

Most threat models, procedures, and business interactions I've been involved with have focused on protecting and securing current or future data, but not so much on what's needed to maintain the safety of data encrypted in the past.

"Previously encrypted data" may not only include encrypted messages or files, but also may include an capture of a previous SSH or VPN session.

Is there any discussion in the IT Security community about data encrypted on a given date, with Moore's law, cloud computing, and the passage of time all working as factors that might eventually allow previously encrypted information to be decrypted?

Is there discussion of how the theft of a private key would put previously encrypted data at risk?

Many people in the business and legal departments consider encryption an infallible lockbox that can never break; they don't see it as an encrypted blob whose security decays over time and needs ongoing vigilance to maintain (e.g. don't be careless with these blobs, and don't leave them on a public server where people may attempt to decrypt them).

I'm interested in any thoughts on:

  • Whether this is a factor in deciding to use a private hard-wired connection instead of a VPN
  • Examples of, and methods to gauge, the time sensitivity of information and its exposure risk
  • Risks and remediations within the IT Security realm
  • Ways to communicate the risk (and relevant remediation) to non-technical folk (e.g. for funding)
  • ... ?
makerofthings7
  • Interesting idea, I guess this is kind of a security "half-life" for crypto... I'm curious what will turn up here. – AviD Mar 22 '11 at 16:17
  • I didn't realise there were businesses that didn't view it as crypto half life... how weird – Rory Alsop Mar 22 '11 at 16:48
  • @Rory I meant the term, and was trying to specify the length thereof, as opposed to just a general "it's not infallible and will eventually break". – AviD Mar 22 '11 at 18:03

2 Answers


One notion is called Perfect Forward Secrecy. This applies to situations where you encrypt data and decrypt it almost simultaneously, but you worry about an attacker who would later on obtain a copy of the decryption key. This is a restrictive model, but it applies to SSL connections: the server private key is long-lived (usually, it is stored in a local file) and thus subject to future theft; whereas there is no legitimate need to decrypt exchanged data at a future date.

In the case of SSL, the solution relies on the "DHE" (Diffie-Hellman Ephemeral) cipher suites. Roughly speaking, client and server use a Diffie-Hellman key agreement, with fresh private exponents, to establish the session keys. The server private key (the one corresponding to the public key in the server certificate) is used only for signing (so that the server is duly authenticated by the client). Therefore, subsequent theft of the server private key does not allow the attacker to decrypt data which was exchanged beforehand. The Diffie-Hellman private exponents and the session key, being transient (linked to that session only), are never stored on a physical medium and are forgotten once the session is closed; therefore, they are much less susceptible to later theft.
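
To make the forward-secrecy property concrete, here is a minimal sketch of an ephemeral key agreement in Python, using the third-party cryptography package's X25519 primitives as a modern elliptic-curve stand-in for the classic DHE groups; the variable names and the HKDF info label are illustrative placeholders, not protocol values:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates a fresh (ephemeral) key pair for this session only.
    client_eph = X25519PrivateKey.generate()
    server_eph = X25519PrivateKey.generate()

    # Only the public halves cross the wire; in SSL the server would also
    # sign its public value with its long-lived certificate key.
    client_shared = client_eph.exchange(server_eph.public_key())
    server_shared = server_eph.exchange(client_eph.public_key())
    assert client_shared == server_shared

    # Derive the symmetric session key from the shared secret.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"illustrative-session-key",  # hypothetical label
    ).derive(client_shared)

    # When the session ends, the ephemeral secrets are simply discarded;
    # later theft of the certificate key reveals nothing about session_key.
    del client_eph, server_eph, client_shared, server_shared

The point of the sketch is the last step: no long-lived secret remains from which session_key could ever be recomputed.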

Moore's law and the overall increase in available computing power are not a big issue, because they are predictable: such computing power increases by a factor of less than 2 every year. Therefore, it is easy to oversize keys a bit to account for that factor: in the case of symmetric encryption, just add one key bit per year. This means that AES with a 192-bit key will be fine for at least one century, at least with regard to such technological advances. For RSA, Diffie-Hellman, Rabin-Williams or ElGamal, aim at 8192-bit keys for the same protection level (384-bit keys for elliptic curves).
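
As a back-of-the-envelope check on that rule of thumb, here is a small Python sketch; the 128-bit "safe today" baseline and the 1.5-year doubling period are assumptions chosen for illustration, not figures from the answer above:

    def years_of_headroom(key_bits: int,
                          safe_today_bits: int = 128,
                          years_per_doubling: float = 1.5) -> float:
        # Each doubling of attacker power "eats" one bit of key length,
        # so the margin above today's safe size translates into years.
        return (key_bits - safe_today_bits) * years_per_doubling

    # AES-192 versus a 128-bit baseline: 64 spare bits * 1.5 years/bit.
    print(years_of_headroom(192))  # 96.0 -- roughly the "one century" claim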

More worrisome are scientific advances, namely the potential discovery of faster algorithms for, e.g., integer factorization. Such advances are much less predictable than technological advances; however, they also seem to happen quite rarely (the last big advance in factorization was the invention of the Number Field Sieve, and that was 20 years ago). The quantum computer is the wild card here: if it can ever be built, it utterly destroys asymmetric cryptography (well, at least the factorization-based and discrete-logarithm-based algorithms, including the elliptic curve variants; McEliece encryption may possibly resist).

Generally speaking, the biggest threat for long-term confidentiality of encrypted data is private key theft. When applicable, PFS really improves things by a fair margin. Worrying about Moore's law or quantum computing is very good news indeed: it means that you have already thwarted all easier attack vectors, which is no small achievement. Thinking that "encryption is an infallible lockbox" is not completely preposterous: if done properly, the encryption part itself will add negligible risks to what you must already face when it comes to storing a private key and keeping it safe while not losing it altogether.

Thomas Pornin
  • So is it true that if the session key was swapped out to disk on either client or server, or the Diffie-Hellman exponents were swapped out on both sides, and later recovered by the attacker, you'd be in trouble? Of course that isn't very likely, but just supposing? – nealmcb Mar 22 '11 at 19:07
  • yes, data in SSL is encrypted (symmetrically) with a session key. The key exchange (RSA encryption or Diffie-Hellman) results in a "pre-master secret" which is then derived into symmetric encryption and MAC keys. If the attacker can obtain a copy of the key exchange private key or the pre-master secret or the symmetric encryption keys, then he can decrypt whatever was encrypted with the symmetric encryption keys, i.e. the data in the tunnel. – Thomas Pornin Mar 22 '11 at 22:03
  • I suspect that some technological advances could be faster than a factor of 2 per year. Specialized hardware can decrease the time to find a solution by orders of magnitude. IIRC, that was how one or more of the DES challenges was won. Particularly I wonder about GPUs being used for this purpose, as well as cloud services. – user1971 Apr 20 '11 at 14:01
  • @ThomasPornin, Does using a longer key/bit-length guard against quantum computers? – Pacerier Jun 09 '14 at 21:54
  • Quantum Computers are a bit like unicorns: as long as they don't exist, they can do "anything". As far as theory goes, bigger keys help against QC for symmetrical algorithms (AES...) but not for the asymmetrical algorithms that QC break thoroughly (RSA, DSA, ECDSA, DH...). – Thomas Pornin Jun 10 '14 at 03:03

The practical upshot of this is that almost all the security departments I work with treat encryption as something they expect to protect data for a limited length of time, and their decisions on which encryption to use are based on how long they need the data to be 'secure'.

Admittedly the factors which feed into this are woolly, and are prone to change as someone discovers a new theoretical attack, but broadly speaking I approve of this approach.

I don't think I know anyone in global orgs dealing with encryption who doesn't look at it like this. They change keys based on the expected time to brute-force half the key space (or less), and change algorithms when new ones are recommended by governments.
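
As a rough illustration of that half-the-key-space criterion, here is a small Python estimate; the attacker rate of 10^12 keys per second is purely an assumed figure for the sketch:

    def brute_force_years(key_bits: int, keys_per_second: float = 1e12) -> float:
        # On average an attacker finds the key after trying half the space.
        expected_tries = 2 ** (key_bits - 1)
        seconds = expected_tries / keys_per_second
        return seconds / (365.25 * 24 * 3600)

    for bits in (56, 80, 128):
        print(f"{bits}-bit key: ~{brute_force_years(bits):.3g} years")
    # 56-bit (DES-sized) keys fall in hours at this rate; 128-bit keys
    # remain far out of reach, which is why key length drives the policy.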

Some updates:

Quite a lot of the encrypted connections my clients use are only important at the time: if they are broken a month later, it doesn't matter. These types of connections typically use high-performance crypto, not crypto that will last years.

Crypto used to protect private medical data, on the other hand, has to be strong for the life of the patient at a minimum, and performance is less important, so much stronger crypto is used.

It's all about appropriateness.

Rory Alsop