
When implementing password hashing with PBKDF2 to authenticate access to a REST API, when we say that PBKDF2 is slow, does that mean it will take a long time to hash and validate the password, making the service unresponsive for the end user?

Or is PBKDF2 slow only when the given password is invalid, and not when it is correct?

microwth
  • Keep in mind that the user is very likely to enter the correct password, so if PBKDF2 takes 0.25 seconds to check it, that is not an issue. An attacker brute-forcing (or using a dictionary) needs thousands or millions of attempts, and in that case an extra 0.25 seconds per attempt makes the attack far slower than against a "normal" hash. – Bakuriu Mar 14 '21 at 10:53
  • @Bakuriu It is not an issue for the *client*, but it IS one for the server(s). If your API handles 1000 requests per second and each request spends 0.25 s on authentication alone, that is 250 seconds of CPU time per second of wall clock which is unavailable for computing the actual API responses. That's huge! It will likely force you to add computational power (servers), which costs more money. Some kind of caching (as described by Jeff Ferland in his answer) is the way to go. – GuiTeK Mar 15 '21 at 15:59
  • @Bakuriu is correct, but note that they're talking about an offline attack where the attacker has the password hash database. Extra iterations are overkill for online attacks against a REST API; instead, throttle login attempts in the API implementation. – Fax Mar 16 '21 at 17:11

4 Answers


PBKDF2 and other key stretching algorithms are meant to be slow and take the same amount of time whether the input password is correct or incorrect.

To reduce computational load and latency for your user, the API should authenticate once via login credentials and issue a revokable or time-limited session token that is verified by a simple lookup.
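That flow can be sketched in a few lines of Python. The names and in-memory stores below are hypothetical (a real service would persist users and sessions and expire tokens), and the iteration count is illustrative only:

```python
import hashlib
import os
import secrets

# Hypothetical in-memory stores; a real service would use a database.
users = {}     # username -> (salt, pbkdf2_hash)
sessions = {}  # session token -> username

ITERATIONS = 200_000  # illustrative work factor; tune for your hardware

def register(username, password):
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    users[username] = (salt, pw_hash)

def login(username, password):
    """Slow PBKDF2 check, performed once; returns a session token on success."""
    salt, stored = users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    if not secrets.compare_digest(candidate, stored):
        return None
    token = secrets.token_urlsafe(32)  # ~256 bits of entropy
    sessions[token] = username
    return token

def authenticate(token):
    """Cheap per-request check: a simple lookup, no key stretching."""
    return sessions.get(token)
```

The expensive PBKDF2 computation runs only at login; every subsequent API request pays just the cost of a dictionary (or database) lookup.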

Jeff Ferland
  • Additionally, using a session token allows implementing the authentication in a separate service, which may further be delegated to another party (think Microsoft or Google OpenID Connect). That way the API server won't even have access to the password database, adding a defence layer in case it is broken into. – Jan Hudec Mar 16 '21 at 18:41

PBKDF2, or for that matter any password hashing algorithm, is designed to be tunable, so it can take a variable amount of time depending on the desired security level. For a given set of parameters, the time it takes is constant whether the password is correct or incorrect, since it's impossible to determine correctness until the operation is complete.

Since a commonly recommended target amount of time for a password hashing algorithm is 100ms, you may find that such a speed has an impact on performance. However, typically APIs don't use hashed passwords for authentication. It is much more common to issue some sort of token to a user and have them use that for authentication instead. This is true for web interfaces as well, where we call that token a session cookie.
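One way to pick the iteration count is to benchmark against that target time. This is an illustrative sketch, not an official API: it times a probe run and scales linearly, since PBKDF2 cost grows linearly with the iteration count:

```python
import hashlib
import time

def calibrate_iterations(target_seconds=0.1, probe_iterations=50_000):
    """Estimate a PBKDF2-HMAC-SHA-256 iteration count taking ~target_seconds.

    Illustrative benchmark: time one probe run, then scale linearly.
    """
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"probe-password", b"fixed-probe-salt", probe_iterations)
    elapsed = time.perf_counter() - start
    # Never go below the probe count; round to an integer iteration count.
    return max(probe_iterations, int(probe_iterations * target_seconds / elapsed))
```

In practice you would run such a calibration once per deployment (or hard-code a value measured on representative hardware) rather than on every login.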

If you generate a token with at least 128 bits of entropy (256 is better), or you use a secure MAC or signature for your token, then this is generally a secure way to do authentication. You can then store your random token in the database, preferably hashed with something like HMAC-SHA-256 (with a key specific to your application), and then the cost of verifying the token is greatly diminished. Or, if you use a secure MAC or signature, you can just verify the MAC or signature and then check that the token's ID is still valid.

You should never store a password or other low-entropy secret with a simple hash or HMAC, but this is secure if you use a high-entropy secret like a token generated from a CSPRNG because brute-forcing it should be impossible.
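A minimal sketch of that token scheme, assuming a hypothetical application-specific HMAC key (in practice you would load it from secure configuration, not hard-code it):

```python
import hashlib
import hmac
import secrets

# Hypothetical application-specific key; load from secure config in practice.
APP_KEY = b"example-32-byte-application-key!"

def issue_token():
    """Generate a 256-bit random token; store only its HMAC in the database."""
    token = secrets.token_bytes(32)
    stored_digest = hmac.new(APP_KEY, token, hashlib.sha256).hexdigest()
    return token, stored_digest  # digest goes to the DB, raw token to the client

def verify_token(presented, stored_digest):
    """Cheap verification: one HMAC plus a constant-time comparison."""
    digest = hmac.new(APP_KEY, presented, hashlib.sha256).hexdigest()
    return hmac.compare_digest(digest, stored_digest)
```

Because the token itself has ~256 bits of entropy, a single keyed hash suffices: an attacker who steals the digest table still cannot feasibly brute-force the tokens, and verification costs microseconds rather than the ~100 ms of a key-stretching function.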

Note that PBKDF2 is no longer a recommended algorithm for password hashing. Memory-hard algorithms such as Argon2 or scrypt are preferred by most cryptographers, although government standards are unfortunately slow to adapt.

bk2204
  • Do you have a source for PBKDF2 not being recommended anymore? From NIST it still seems like that's the way to go – Hugo Dozois Mar 15 '21 at 17:59
  • I don't have a readily available source other than things I've read from cryptographers. PBKDF2 is recommended by NIST, but it is not memory-hard, meaning that GPUs or ASICs can speed up the operation significantly, whereas memory-hard functions make that much more difficult. NIST is behind the times with PBKDF2, much like they're behind the times in not supporting ChaCha20. – bk2204 Mar 15 '21 at 18:06

when we say that PBKDF2 is slow, does that mean it will take a long time to hash and validate the password, making the service unresponsive for the end user?

It won't add any meaningful latency to your REST API. A computation that takes 100ms once will not be noticeable to your clients, but will be prohibitively slow for someone who is attempting to perform billions of password cracking attempts on stolen database hashes. All this algorithm does is reduce the speed of password cracking from potentially billions of tries per second to thousands or less.

Or is PBKDF2 slow only when the given password is invalid, and not when it is correct?

No. It takes exactly as long whether the password is correct or not. In fact, the entire PBKDF2 computation must complete before the system can even determine whether the password is correct.
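This is easy to see in code: the full iteration count runs regardless of the input, and only afterwards is the result compared. A small illustrative sketch (passwords and iteration count are placeholders):

```python
import hashlib
import hmac
import os
import time

ITERATIONS = 200_000  # illustrative work factor
salt = os.urandom(16)
stored = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, ITERATIONS)

def check(password):
    # All ITERATIONS rounds always run; only then is the result compared,
    # so timing reveals nothing about whether the password was right.
    candidate = hashlib.pbkdf2_hmac("sha256", password, salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)

def timed(password):
    start = time.perf_counter()
    ok = check(password)
    return ok, time.perf_counter() - start

ok_right, t_right = timed(b"correct horse")
ok_wrong, t_wrong = timed(b"battery staple")
# Both calls perform identical work, so t_right and t_wrong are close.
```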

forest
  • This totally depends on the use case: if the API call is high-frequency and supposed to be processed in 10 ms, it will slow it down tenfold. In that case you can either cache the result (which keeps the passwords in memory, so at least hash them with a single iteration as the key) or avoid a strong password hash (which is fine if you know the passwords are high-entropy, randomly created, and not reused). Besides that, working with tokens or good old-fashioned HTTP session cookies is also a possible optimization. – eckes Mar 14 '21 at 14:41
  • @eckes: you misunderstood how "takes 100ms once will not be noticeable to your clients" works. For an API, the password is sent once for verification, and a token is returned that is used on all subsequent calls until it expires. It would in fact be reasonable to throttle the verification to a second or even more, because if the verification time is considered a bottleneck the solution isn't to speed up verification but to allow overlapping token expiration periods and let the client juggle the switch. As an API client, the password should never be entered incorrectly. – jmoreno Mar 14 '21 at 21:00
  • REST APIs that use neither sessions nor tokens are not that rare. If you have no such API, that's fine; if you do, my comment lists several ways around the cost. API keys in particular don't need a 100 ms PBKDF2 hash. – eckes Mar 14 '21 at 21:03

PBKDF2 is commonly implemented as 2048 iterations of hashing, to slow down password crackers who have acquired a list of hashes. It's slow by design. The number of iterations is configurable, but would you want to make it less secure for the sake of speed?

To diverge from the theme of the thread: isn't it time to stop using passwords and session tokens and use PKI for authentication instead? API access to web resources is becoming more and more common, and HTTPS has supported PKI authentication via client certificates for about 20 years. Why not adopt it?

  • PBKDF2 is a little more complex than just iterations of hashing (and 2048 is way too low anyway). – forest Mar 15 '21 at 00:50
  • PKI has terrible UX: the last time I used a service that relied on it, dealing with client certificates was a mess! How do you expect people to deal with client certificates on their phones? Something similar, far from perfect but with better UX, is WebAuthn. – Gustavo Rodrigues Mar 15 '21 at 16:04
  • @Gustavo The UX can be fixed. See how Bitcoin creates new keypairs and signs transactions without the user even being aware that private keys exist, without being able to view them – Louis Thompson Mar 17 '21 at 21:19