1

I'm just wondering, because it seems like it would. Cryptography as a field may have to start all over from the beginning.

user628544
  • I think that answers my question well enough. I will have to do a lot of reading on the things they talk about as most of it is stuff I've never heard of. – user628544 Aug 24 '16 at 16:23
  • @user628544, [Thomas Pornin's answer](http://security.stackexchange.com/questions/48022/what-kinds-of-encryption-are-not-breakable-via-quantum-computers/48027#48027) is a very good one. Feel free to post a new question if you find something that you are specifically unclear about. – 700 Software Aug 24 '16 at 16:47
  • Reports of PKI's death have been greatly exaggerated. – dandavis Aug 24 '16 at 21:09

2 Answers

2

On a slightly different tack, Mike's response addresses the question of "could/can quantum computing destroy" some of our present encryption. This is slightly different from "will it". Basic quantum computers have been built and work. The challenge now lies in increasing the number of qubits.

In a recent breakthrough, MIT researchers used a 5-qubit quantum computer to factor the number 15. Now 15 is a pretty big number. I can't even count that high without taking off one of my shoes. The world record is factoring 21. I can't count that high with both of my shoes off!
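As an aside, the only step of Shor's algorithm that actually needs the quantum computer is finding the period of a^x mod N; everything around it is classical. Here is a minimal classical sketch for N = 15 that brute-forces the period instead of using qubits, just to show where those factors come from (illustrative only, since brute-forcing the period is exactly the part that is exponentially hard classically):

```python
from math import gcd

def shor_classical_demo(N, a):
    """Classically simulate the core of Shor's algorithm for tiny N.
    The quantum computer's only job is finding the period r of
    a^x mod N; here we find it by brute force instead."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a shares a factor with N
    # Find the period r: the smallest r > 0 with a^r = 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 != 0:
        return None  # odd period: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with a different a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical_demo(15, 7))  # (3, 5)
```

With a = 7 the period is r = 4, and gcd(7^2 - 1, 15) = 3 and gcd(7^2 + 1, 15) = 5 recover the factors.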

Kidding aside, the point here is that for a quantum computer to factor a large key, it needs a lot of qubits held in superposition together. The trouble is that the more qubits you try to keep together in that state, the harder it is to maintain. This is part of why quantum mechanics seems so strange to us: even the tiniest objects we deal with in the natural world are made up of very large numbers of atoms.

It's kind of like juggling. A lot of people (e.g. me) can juggle three balls. Juggling four is far less common; not something I can do. Juggling five is pretty extraordinary, and six or seven is amazing. Eleven balls is the Guinness world record. Getting 5 qubits working together was a big deal. Getting 1025 in superposition? No one today knows if that's feasible, and if it is, who's to say we aren't using 12288-bit encryption at that point?

Don't get me wrong, I'm not saying it won't happen. Perhaps there will be a breakthrough and the increasing difficulty of keeping more qubits in superposition will go away. But I think most people who really understand this stuff and are being honest will tell you it is still an 'if' and not a 'when' yet.
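To put rough numbers on the scale involved: textbook analyses of Shor's algorithm need on the order of 2n logical qubits to factor an n-bit modulus (Beauregard's well-known circuit uses 2n + 3), and that is before quantum error correction multiplies the physical-qubit count by orders of magnitude. A back-of-the-envelope sketch:

```python
def logical_qubits_for_shor(modulus_bits):
    """Rough logical-qubit count for factoring an n-bit modulus,
    using Beauregard's 2n + 3 circuit as the estimate.
    Ignores error correction, which inflates the physical-qubit
    count by orders of magnitude."""
    return 2 * modulus_bits + 3

for bits in (1024, 2048, 12288):
    print(f"RSA-{bits}: ~{logical_qubits_for_shor(bits)} logical qubits")
```

Compare those thousands of logical qubits with the 5 entangled qubits of the MIT experiment above.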

JimmyJames
  • I was at a talk at a recent quantum crypto conference all about the public misconception of using the size of the number factored as a way to measure progress on quantum computers. Current tech only allows for 1D arrays of qubits - which, as you say, has entanglement issues around 5 qubits. There are a few engineering problems blocking us from building 2D arrays of qubits, but once we get that we should scale up to thousands of entangled qubits very very quickly. – Mike Ounsworth Aug 24 '16 at 18:55
  • The other research efforts are around cooling, superconducting, and entrapment to dramatically bring down the power costs. Again, number of qubits and size of the number factored are bad ways to measure this. So yes, there's a lot of speculation around _when_ and _if_, but most of these research projects are one or two discoveries away from skyrocketing ... so I'm in the "when" camp. – Mike Ounsworth Aug 24 '16 at 18:57
  • @MikeOunsworth I may be reading 2D here too literally but are you saying that 10 X 1 qubits is hard but say 32 X 32 (assuming other problems are solved) is easy? – JimmyJames Aug 24 '16 at 19:01
  • @MikeOunsworth Can you explain how the number of qubits isn't important? Based on what I understand (I claim no expertise) you need lots of qubits to factor large numbers. – JimmyJames Aug 24 '16 at 19:06
  • Yes I am saying that 10 X 1 qubits is hard but say 32 X 32 (assuming other problems are solved) is easy. In a 1xn array each qubit has at most 2 neighbours to entangle with, in an nxn grid each qubit has 4 neighbours to entangle with, making it much easier to keep the qubits stable and to entangle much larger numbers of them. The experts I've talked to are convinced that once we have a stable 32x32, it'll be easy to put four of those together to make a 64x64, and four of those for a 128x128, etc up to the limit of our power and cooling capabilities. – Mike Ounsworth Aug 24 '16 at 20:05
  • Also, you're right that the number of qubits and the size of the number factored is the ultimate end-goal. Analogy: you want your kids to grow up to have good jobs and make lots of money, but judging your 9 year old by how much she makes at her paper route is a little premature: you know that'll jump once she graduates college. – Mike Ounsworth Aug 24 '16 at 20:17
  • @MikeOunsworth Again, I am no expert in this area but my understanding is that the more states you bring into superposition, the less likely it is to remain in superposition. So while what you are saying may be true, it seems to contradict that and in my cursory search I can't find anything that confirms what you are saying here. if you have a reference, I'd appreciate the link. – JimmyJames Aug 24 '16 at 20:24
  • It's all a bit weird, I admit. Consider that in a 1xn array, if a single qubit decoheres, the entire chain is broken. In an nxn grid, if a single qubit goes down, the grid keeps its coherence because it's still connected - and that qubit can be restored. Basically: nxn grids have a fault tolerance that's not possible with 1xn arrays. This conference talk from 2014 is a bit old, but shows the concepts nicely: https://youtu.be/wWHAs--HA1c?list=PLBRgytHojT9au1KkDncCRwZ0igvOM-V8G – Mike Ounsworth Aug 24 '16 at 20:32
0

Short answer: symmetric ciphers like AES are fine. Hashes are fine. Anything involving public keys (RSA, Elliptic Curve, Diffie-Hellman, Signatures, etc) will have to be re-engineered from the ground up.
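A rough way to see why only the public-key side needs re-engineering: the best classical factoring attack (the General Number Field Sieve) is subexponential in the key size, so making RSA keys bigger keeps working against classical attackers, while Shor's algorithm is polynomial, so no practical key size saves you. A sketch using the usual asymptotic formulas (constant factors and memory are ignored, so treat the numbers as illustrative only, not as precise security levels):

```python
from math import log, log2

def gnfs_bits(modulus_bits):
    """Approximate log2 cost of classical factoring via the GNFS
    heuristic: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = modulus_bits * log(2)  # ln N for an n-bit modulus
    return (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3) / log(2)

def shor_bits(modulus_bits):
    """Approximate log2 gate count for Shor's algorithm, ~n^3."""
    return 3 * log2(modulus_bits)

for bits in (1024, 2048, 4096):
    print(f"RSA-{bits}: classical ~2^{gnfs_bits(bits):.0f} ops, "
          f"quantum ~2^{shor_bits(bits):.0f} ops")
```

Doubling the modulus noticeably raises the classical cost but barely moves the quantum one, which is why the fix for public-key crypto is new algorithms rather than bigger keys.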

Mike Ounsworth
  • I guess that means the AES key exchange used by SSL/TLS will no longer be secure, and that protocol will have to be re-engineered. – user628544 Aug 24 '16 at 16:37
  • @user628544, That is correct. Fortunately Quantum Computers would be highly expensive initially. This would limit the potential exposure of sensitive data if/when a stable Quantum Computer is finally built. – 700 Software Aug 24 '16 at 16:48
  • Yup. Expert opinions are that we'll see the first quantum computers capable of breaking RSA-2048 around the year 2030, but they will cost about $1 billion USD mainly because of the dedicated nuclear power plant they will need to power them. Not commodity items. Unless you happen to hold military secrets, your PGP keys can probably stay on RSA well past 2030. – Mike Ounsworth Aug 24 '16 at 17:06
  • Actually not all symmetric ciphers are fine though. AES-128 for example is not post-quantum secure. – Yorick de Wid Aug 24 '16 at 17:12
  • @YorickdeWid Sure, and AES-32 (if it existed) wouldn't even be raspberry pi secure. Using existing algorithms with bigger keys will do the job - public key tech, on the other hand, needs a fundamental redesign. – Mike Ounsworth Aug 24 '16 at 17:25
  • @YorickdeWid: Could you please provide a link to the AES issue? – Mok-Kong Shen Aug 24 '16 at 17:36
  • The accepted answer in the dup addresses why AES needs its key sizes doubled. For more general reading, Google "Grover's Algorithm". – Mike Ounsworth Aug 24 '16 at 17:41
  • @Mok-KongShen see Grovers Algorithm – Yorick de Wid Aug 24 '16 at 17:45
  • @MikeOunsworth I'd agree, but AES-128 is very much NIST; 32 or anything other than 128, 192, 256 obviously is not. I just pointed out you over-generalized it somewhat. – Yorick de Wid Aug 24 '16 at 17:48
  • @YorickdeWid Sure, sysadmins will need to change their TLS settings, or possibly purchase new hardware if it's doing hardware AES-128, but in the grand scheme of things that's "fine" next to the effort of developing and deploying brand new algorithms. – Mike Ounsworth Aug 24 '16 at 17:58
  • @MikeOunsworth Hell yeah. And we have quite some time before we get there :) – Yorick de Wid Aug 24 '16 at 18:36
  • @YorickdeWid: I looked at wiki on Grovers algorithm but failed (if I don't err) to see a specific statement saying that AES 128 is considered to be broken by it. – Mok-Kong Shen Aug 25 '16 at 16:49
  • @Mok-KongShen It has been discussed many time on security. For example http://security.stackexchange.com/a/48027/59575 – Yorick de Wid Aug 25 '16 at 16:54
  • @Mok-KongShen Yeah, it probably doesn't say it directly. The logic goes like this: cryptographers like things that have >= 128-bit security (i.e. requiring ~2^128 guesses to crack). Grover's Algorithm halves the effective key length, so it could crack AES-128 with only ~2^64 guesses, giving it only 64 bits of security against a quantum computer. 64 < 128, ergo AES-128 is not good enough against a quantum computer. That's really all there is to it. – Mike Ounsworth Aug 25 '16 at 17:21
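The arithmetic in that last comment is easy to check. Grover's algorithm searches N possibilities in about √N evaluations, which halves the effective bit-security of a brute-force key search:

```python
from math import log2

def grover_security_bits(key_bits):
    """Effective security of a key search under Grover's algorithm:
    ~sqrt(2^n) guesses instead of 2^n, i.e. n/2 bits of security."""
    classical_guesses = 2 ** key_bits
    grover_guesses = classical_guesses ** 0.5  # Grover's sqrt speedup
    return log2(grover_guesses)

print(grover_security_bits(128))  # AES-128: 64-bit quantum security
print(grover_security_bits(256))  # AES-256: 128-bit, still comfortable
```

This is why the standard post-quantum advice for symmetric crypto is simply to double the key size.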