15

Suppose a 256-bit AES key must be broken by brute force on a 2 GHz computer. How long would it take to break the key in the best case and in the worst case? Assume that 1000 clock cycles are required to check a single AES key.

Nadishan
  • Beware! Your professor knows about *Information Security* and is reading this. – Bob Brown Feb 24 '15 at 13:14
  • May I ask on what the assumption is based that 1000 clock cycles are required to check one AES key? Is this a valid real-world assumption? I have no idea, just wondering. – Michael Feb 24 '15 at 13:52
  • @Michael: It's a homework problem. The assumption was provided by the professor. – Bob Brown Feb 24 '15 at 15:12
  • @BobBrown Yes, but it would be interesting to know if it is a valid assumption, otherwise this question is of not much value.. – Michael Feb 24 '15 at 16:49
  • @Michael: It is far too low. A computer with a 2GHz clock is also likely pipelined, so it completes roughly one instruction per clock cycle. That means you have 1,000 machine instructions to pick the next key to try (which may be as simple as adding one to a predecessor), decrypt a minimum of one block of data, and then somehow test whether the decryption was successful. I haven't really worked this out, but I'd bet that one can't compute even one round of AES for a 256-bit key in 1,000 instructions. Instead, 1,000 was picked by the professor to make the computation easier. – Bob Brown Feb 24 '15 at 21:04
  • Related: [Amount of simple operations that is safely out of reach for all humanity?](https://security.stackexchange.com/q/6141/2138) – user Jun 15 '16 at 09:09

4 Answers

23

One of my favourite gems on encryption is from Bruce Schneier, in his book *Applied Cryptography*:

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38×10^-16 erg/K, and that the ambient temperature of the universe is 3.2 K, an ideal computer running at 3.2 K would consume 4.4×10^-16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our Sun is about 1.21×10^41 ergs. This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.

But that's just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
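The arithmetic in the quote is easy to sanity-check. A minimal sketch, using only the constants Schneier quotes (note his counter-size figures come out within a bit or so of a naive log2, since cycling an n-bit counter actually costs somewhat more than 2^n bit changes):

```python
import math

# Figures from the quote above (a sanity-check sketch, not a proof)
K_BOLTZMANN = 1.38e-16        # erg/K
T_UNIVERSE = 3.2              # K, cosmic background temperature
ENERGY_PER_BIT = K_BOLTZMANN * T_UNIVERSE   # minimum ergs per bit change

SUN_ANNUAL_OUTPUT = 1.21e41   # ergs per year
SUPERNOVA_OUTPUT = 1e51       # ergs

flips_sun_year = SUN_ANNUAL_OUTPUT / ENERGY_PER_BIT
flips_dyson_32y = 32 * flips_sun_year
flips_supernova = SUPERNOVA_OUTPUT / ENERGY_PER_BIT

# One solar year funds ~2^187 bit changes, 32 years ~2^192; the supernova
# lands within a bit or two of the quoted 219-bit counter.
print(f"solar year: 2^{math.log2(flips_sun_year):.1f} bit changes")
print(f"32 years:   2^{math.log2(flips_dyson_32y):.1f} bit changes")
print(f"supernova:  2^{math.log2(flips_supernova):.1f} bit changes")
```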

See also https://www.schneier.com/blog/archives/2009/09/the_doghouse_cr.html

Josef
Qwerky
  • this was indeed a gem. thanks for sharing. – Kyle Baker Nov 29 '21 at 02:43
  • [Reversible computing](https://en.wikipedia.org/wiki/Reversible_computing) or 'adiabatic computation' is a theoretical way to beat the von Neumann limit, so it's not *strictly* required to burn a sun's worth of energy to count to 2**256. – Nick T Aug 12 '22 at 01:37
17

Well, using simple math: If checking one key takes 1000 clock cycles, and the computer has 2,000,000,000 cycles per second, it checks 2 million keys per second. The best case is that the first key you try is correct: total time is half a microsecond. The worst case is that the last key you try is correct: you have 2^256 keys divided by around 2^21 checked per second (that's more like 2.1 million, but close enough), which is 2^235 seconds, which according to Wolfram Alpha is around 1.75 vigintillion (that's 1.75×10^63) years, or around 1.3×10^53 times the age of the universe.

If running your computer for as long as the universe has existed lets you check A keys (A is absurdly huge), and a magical computer that can check A keys per second (this is running a desktop computer for 14 billion years every second) running for as long as the universe has existed would have checked B keys in that time, then if you happen to have a super-duper magical computer that checks B keys per second (keep in mind that this is as fast as a desktop running every second for 14 billion years, per second) and has been running since the Big Bang, you would only be around 68% done with your brute-force.

To put it another way: The Sun will die out in a paltry 5×10^9 years. In that time, the ratio of the progress you've made to the total amount of work you have to do is within a couple orders of magnitude of the ratio of the mass of one hydrogen atom to the mass of the supermassive black hole at the center of the galaxy. However, Wikipedia lists the heat death of the universe as occurring at earliest in 10^100 years, so you will crack it by then.

The average case is half of the worst case.
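The whole calculation above fits in a few lines, assuming only the question's given figures (2 GHz clock, 1000 cycles per key check):

```python
# Sketch of the calculation above, using the question's assumptions:
# a 2 GHz clock and 1000 cycles per key check.
CLOCK_HZ = 2e9
CYCLES_PER_KEY = 1000
KEYS_PER_SECOND = CLOCK_HZ / CYCLES_PER_KEY   # 2 million keys/s

SECONDS_PER_YEAR = 365.25 * 24 * 3600

best_case_s = 1 / KEYS_PER_SECOND            # first key tried is correct
worst_case_s = 2**256 / KEYS_PER_SECOND      # last key tried is correct
average_case_s = worst_case_s / 2            # on average, half the keyspace

print(f"best case:    {best_case_s:.1e} s")
print(f"worst case:   {worst_case_s / SECONDS_PER_YEAR:.2e} years")
print(f"average case: {average_case_s / SECONDS_PER_YEAR:.2e} years")
```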

Trang Oul
cpast
  • "The best case is that the first key you try is correct" - *How did the hackers get in, sergeant?* *Absurdly good luck, sir* – PyRulez May 08 '16 at 17:09
3

This sounds like homework to me, but try WolframAlpha:

Worst case, you have to try all 2^256 possible values; it takes 1836000000000000000000000000000000000000000000000000000000000000 (about 1.8×10^63) years.

Best case, you try one value and it's correct; it takes 0.0000005 seconds.
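The worst-case figure can be reproduced with exact integer arithmetic; the printed number appears to assume 365-day years:

```python
# Reproducing the figure above with exact integer arithmetic
# (365-day years, which is what the printed number appears to use).
seconds = 2**256 * 1000 // (2 * 10**9)   # total cycles / clock rate
years = seconds // (365 * 24 * 3600)
print(years)   # on the order of 1.8 * 10^63 years
```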

Hybrid
-2

The important thing about information security is to think laterally before your foe does.

So, use that CPU time to mine bitcoins, or whatever makes money using a CPU. Do whatever it is, enough that you can get another CPU in 18 months. Buy that other CPU. Rinse, repeat.

Let's go with the 2^235 CPU-seconds from another post. That's about 2^209 18-month Moore's-law cycles (not a safe assumption that the law holds that long, but people have been claiming the law was dead since the 90s and it's still going, so let's go with it...). After 235 of those 18-month cycles, then, we'd crack the key in a CPU-second. But we don't need to wait that long. There's a concept called the "Moore breakeven point", where the time saved by waiting another 18 months becomes less than 18 months. I think that's only 210*18 months away.

So, overall, that's just 315 years.

But about that bitcoin mining: don't buy another CPU with the proceeds. Don't even buy the first computer. Save that $1,000 or so. Instead, just invest it in something that gives you 5% over the current interest rate.

At the end of those 315 years, after investing $1000 at 5% annual, compounded monthly, you'll have the equivalent of... $4.7 billion. Woah! That's enough to buy 4,700,000 PCs! If you also invest the money you're saving in electricity by not running a desktop, about $20/month, you end up with $27Bn. So there's another, smaller log factor to take into account alongside Moore's law: at what point, investing your initial money plus $20/month, is it worth cashing out and buying the best network of computers you can, to crack the hash fastest?
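For what it's worth, the dollar figures above check out. A quick sketch (note the quoted $4.7B and $27B actually line up with *annual* compounding at 5%, so that is what this uses):

```python
# Checking the investment figures in the answer above; the quoted $4.7B
# and $27B line up with annual compounding at 5%.
RATE = 0.05
YEARS = 315

lump_sum = 1000 * (1 + RATE) ** YEARS
# $20/month ~= $240/year, contributed at the end of each year (ordinary annuity)
electricity = 240 * ((1 + RATE) ** YEARS - 1) / RATE

print(f"$1000 invested:             ${lump_sum:.2e}")
print(f"plus $20/month electricity: ${lump_sum + electricity:.2e}")
```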

Dewi Morgan
  • This doesn't really answer the question. You seem to be doing a cost-benefit analysis of Bitcoin mining. – forest Apr 13 '18 at 05:32
  • @forest The answer was "just 315 years", but this was just an illustrative ballpark. An excellent literal answer was already given and accepted. The meta-answer I'm illustrating here instead is: it's bad practice to create or use systems which rely on encryption remaining inviolate for more than a few years based on the abilities of current tech. Advances in computing and math invalidate any estimate past a few decades at a very optimistic best. – Dewi Morgan Apr 13 '18 at 22:06