
Related: Are there any working proof-of-concept string comparison timing attacks?

I was looking at doing some encryption and hashing in PHP and came across this note

Please be careful when comparing hashes. In certain cases, information can be leaked by using a timing attack. It takes advantage of the == operator only comparing until it finds a difference in the two strings.

It seems to me that in most cases the difference in comparing two strings will only be a few clock cycles. Given the overhead of HTTP and PHP, and variations in network latency, there will be a lot of variation in the time taken to process a request. I don't see how such an attack could work over the public internet.

Has such an attack ever been demonstrated in a real-world situation?

I'm not saying that you shouldn't protect against it just in case, I am just wondering how big the risk actually is.
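To make the note concrete: the leak comes from the comparison stopping at the first differing byte, so guesses sharing a longer prefix with the secret take slightly longer to reject. A minimal sketch (in Python rather than PHP, and counting byte comparisons instead of wall-clock time so the effect is visible without jitter; the `early_exit_compare` helper is hypothetical, standing in for `==`):

```python
def early_exit_compare(a: bytes, b: bytes) -> tuple[bool, int]:
    """Naive ==-style compare: stops at the first differing byte.
    Returns (equal, number_of_byte_comparisons) so the leak is
    visible deterministically, without flaky timing measurements."""
    if len(a) != len(b):
        return False, 0
    steps = 0
    for x, y in zip(a, b):
        steps += 1
        if x != y:
            return False, steps
    return True, steps

secret = b"deadbeef"
# A guess sharing a longer prefix needs more comparisons before it is
# rejected, so it takes (slightly) longer -- that is the side channel.
_, steps_wrong  = early_exit_compare(secret, b"zzzzzzzz")  # no prefix match
_, steps_prefix = early_exit_compare(secret, b"deadzzzz")  # 4-byte prefix match
print(steps_wrong, steps_prefix)  # prints "1 5"
```

An attacker who can measure that difference can recover the secret one byte at a time instead of brute-forcing the whole string.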

Jeremy French
  • So the hypothetical hack here is that they send over a few million requests, eventually figure out your hash, and then do an offline attack to compute the password that generates that hash? Why not just spend those same millions of requests, just actually trying different passwords? – Jason Coyne Jul 20 '15 at 19:33

2 Answers


With enough requests made it is possible to carry out statistical analysis of the results. Check out ircmaxell's blog post:

It's been shown that you can remotely detect differences in time down to about 15 nanoseconds using a sample size of about 49,000

Analysing a large number of requests will iron out any network or application latency issues.
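This averaging effect can be sketched in a simulation (not a real network attack): a hypothetical 15 ns signal buried in much larger per-request jitter, recovered by averaging roughly 49,000 samples per candidate, the sample size the quote mentions. The jitter magnitude and timings below are illustrative assumptions, not measurements:

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

SIGNAL_NS = 15    # assumed extra time taken by the "longer prefix" guess
JITTER_NS = 500   # assumed per-request noise (network + application)
SAMPLES = 49_000  # sample size from the quoted figure

# Individual measurements are hopelessly noisy: the jitter dwarfs the signal.
fast = [random.gauss(1000, JITTER_NS) for _ in range(SAMPLES)]
slow = [random.gauss(1000 + SIGNAL_NS, JITTER_NS) for _ in range(SAMPLES)]

# But the noise has zero mean, so averaging cancels it and the tiny
# systematic difference between the two candidates emerges.
recovered = statistics.mean(slow) - statistics.mean(fast)
print(f"recovered difference: {recovered:.1f} ns")
```

With ~500 ns of noise, the standard error of the difference of means over 49,000 samples is only about 3 ns, so a 15 ns signal stands well clear of it.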

SilverlightFox
    The blog post says 15ns, but the linked article says 15µs over the net, 100ns on a LAN. – Brian Hempel May 12 '15 at 15:00
  • 4
    Having read through the attached paper. Most of it's conclusions are lan related. Also difference in time for string comparison will be only a few clock cycles – Jeremy French May 13 '15 at 21:15

The problem is that the overhead generally averages out. If you time one string a million times and another string a million times, you will still be able to detect a difference in the comparison. The more attempts an attacker can make, the better the chance that even minute timing differences can be detected. It can therefore make sense to limit the number of attempts, e.g. by introducing a time delay, if the added latency is acceptable.
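The standard mitigation is a comparison whose running time does not depend on where the strings first differ: PHP provides `hash_equals()` for exactly this, and Python has `hmac.compare_digest()`. The usual technique is an XOR accumulator, sketched here in Python (the helper name is mine):

```python
import hmac

def constant_time_equals(a: bytes, b: bytes) -> bool:
    """XOR-accumulator comparison: always touches every byte, so the
    time taken does not leak the position of the first difference.
    (PHP's hash_equals() and Python's hmac.compare_digest() do this
    for you; prefer them in real code.)"""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y  # accumulate differences, never exit early
    return diff == 0

print(constant_time_equals(b"deadbeef", b"deadbeef"))  # prints "True"
print(constant_time_equals(b"deadbeef", b"deadzzzz"))  # prints "False"
```

Because the loop always runs to the end, the number of byte operations is identical for `b"zzzzzzzz"` and `b"deadzzzz"`, removing the signal the attacker is averaging for.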

Maarten Bodewes