
I've been reading about 2FA and how it is used, and the thing that struck me most is that everyone seems to be storing the TOTP secret as plaintext in their database.

I understand you need the secret as plaintext in order to verify the OTP, so you can't hash it, but couldn't you at least encrypt it in some way, in case your database gets leaked? I feel like there are ways to make this a bit more secure, but I'm wondering if there are downsides to the ideas I have.

  1. Encrypt the TOTP secret using a key stored on the server: This wouldn't be ideal since you'd be encrypting everything with the same key, but a database can be leaked without the attacker having full access to all files on the server. In that scenario, at least the TOTP secrets are still protected.
  2. Encrypt the TOTP secret using the user's password: When the user logs in and the password hash check is valid, the same password that was sent could be used as a key to encrypt/decrypt the TOTP secret. When users first set up their 2FA, you require that they enter their password so you can use it to encrypt the TOTP secret. When they log in with a correct password, you decrypt the TOTP secret using the password, and you validate the OTP. If your database is leaked, the password is hashed and the TOTP secret is encrypted, so the attacker has no information on any account unless they know the password.
  3. Combine both methods: Combine the password with a locally stored key, and use this as the encryption key for the TOTP secret. This way, if the database is leaked and the attacker knows the password, they still can't decrypt the TOTP secret unless they also have access to the stored key.
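To make #3 concrete, idea #3 could be sketched as a key-derivation step: stretch the user's password with PBKDF2 and mix in a server-side key ("pepper") that lives on the filesystem, outside the database. This is an illustration under my own naming and parameter choices, not a reviewed design:

```python
import hashlib
import hmac

def derive_totp_encryption_key(password: str, server_key: bytes,
                               salt: bytes, iterations: int = 200_000) -> bytes:
    """Combine the user's password (#2) with a key stored on the
    server's filesystem (#1) into one 32-byte encryption key."""
    # Stretch the password so a leaked database resists offline brute force.
    stretched = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                    salt, iterations)
    # Mix in the server-side key: an attacker who only has the database
    # dump (even knowing the password) cannot reproduce this step.
    return hmac.new(server_key, stretched, hashlib.sha256).digest()
```

The derived key would then feed an authenticated cipher to encrypt the TOTP secret; on each successful login you re-derive it from the submitted password and decrypt. One caveat: after a password reset the old ciphertext can no longer be decrypted, so the 2FA secret has to be re-encrypted or re-enrolled at that point.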

Are there any flaws in this, and if so, what are they? I feel like anything would be better than just storing the TOTP secret as plaintext.

Jeff Atwood
Y0lk
  • For #2 you might have a usability problem - if a person resets their password, they'd have to re-generate their TOTP secret as well. This is not a show-stopper, but it adds extra steps on the user's side. – ralien Mar 09 '18 at 14:12
  • You've already highlighted the issue with #1. #2 is predicated on the 2nd factor being re-encrypted when the user changes their password - but often these are different, unlinked systems (and the 2nd factor is often invoked via RADIUS, which doesn't have a means for changing an encryption key / custom attribute). #3 is predicated on #1 (which you've already highlighted has issues) and #2, which has the problems I mentioned. So really your question is "Why can't the 2nd-factor secret be encrypted using the 1st factor?" – symcbean Mar 09 '18 at 16:31
  • That's actually a good way to word the question. - Resetting your password means you'd have to reset the 2FA, but assuming you have control over it (such as if you store the secret in a simple database), I don't see that as being too much of a problem. - #1 is a problem by itself, but if you encrypt it using the 1st factor as well, then what's the issue? For example you could encrypt the secret using 1st factor as the key, and then encrypt that ciphertext using a locally stored key. – Y0lk Mar 09 '18 at 17:00

1 Answer


Where have you been reading about 2FA and why do you assume that everyone stores the secret keys in plain text?

As you are talking about TOTP, you should probably read RFC 4226 and RFC 6238.
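As a quick reference for how the two RFCs fit together: RFC 4226 defines HOTP as a dynamically truncated HMAC-SHA-1 over an 8-byte counter, and RFC 6238's TOTP simply uses the current 30-second time step as that counter. A minimal Python sketch (function names are my own):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP per RFC 4226: HMAC-SHA-1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # low 4 bits pick the offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """TOTP per RFC 6238: HOTP with the time-step count as the counter."""
    t = int(time.time()) if for_time is None else int(for_time)
    return hotp(secret, t // step, digits)

# RFC test vectors for secret b"12345678901234567890":
# hotp(secret, 0)            -> "755224"  (RFC 4226 Appendix D)
# totp(secret, for_time=59)  -> "287082"  (time step 1)
```

Note that both sides of the protocol need the same plaintext `secret` to compute the HMAC, which is exactly why the question of how to store it arises.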

Yes, e.g. the Google PAM module stores secrets in plain text in the user's home directory. But please note: the HOTP algorithm was published in 2005, and the first iPhone was released in 2007. We can simply deduce that the HOTP algorithm was not meant for smartphones - nor was the TOTP algorithm, which was published even later.

The problem with encryption versus plain text becomes apparent if you are using HOTP/TOTP to authenticate on a local machine. But the OTP algorithms are designed to be used with a trusted backend system. A local machine, of course, cannot be trusted, as an attacker might be sitting in front of that very machine.

Take a look at section 7.5, "Management of Shared Secrets", of RFC 4226:

Here it is recommended to encrypt the shared secrets, just like you suggested in #1. And of course, if someone gets the encryption key, the authentication is useless. But this is a backend system, which you need to protect. E.g. our (disclaimer!) open source software privacyIDEA actually encrypts the shared secrets - if you want, even with a hardware security module. (Again - this is a backend system.)
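Encrypting the stored secrets as in #1 amounts to a simple round trip at enrollment and verification time. A real deployment would use a vetted AEAD cipher (e.g. AES-GCM) or an HSM as mentioned above; the HMAC-counter keystream below is only a self-contained stand-in so the sketch runs with the standard library alone:

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """HMAC-SHA256 run in counter mode to produce a keystream.
    Illustrative only - use a vetted AEAD cipher in production."""
    blocks = []
    counter = 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hmac.new(key, nonce + counter.to_bytes(8, "big"),
                               hashlib.sha256).digest())
        counter += 1
    return b"".join(blocks)[:length]

def encrypt_secret(server_key: bytes, totp_secret: bytes) -> bytes:
    """Encrypt the shared secret before writing it to the database."""
    nonce = os.urandom(16)  # random per record, stored with the ciphertext
    stream = _keystream(server_key, nonce, len(totp_secret))
    return nonce + bytes(a ^ b for a, b in zip(totp_secret, stream))

def decrypt_secret(server_key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext secret for OTP verification."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(server_key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))
```

With this layout, a database dump alone yields only nonces and ciphertexts; the attacker additionally needs `server_key` from the backend (or the HSM) to recover any shared secret.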

You might also want to read a bit more about the flaws of the Google Authenticator design: https://netknights.it/en/the-problem-with-the-google-authenticator/

cornelinux
  • _the HOTP algorithm was not meant for smartphones - neither the TOTP algorithm, which was published even later._ That's a very interesting point, does it also explain my doubt from [this question](https://security.stackexchange.com/q/188025/56343), that is, is this also the reason why a smartphone where the user enters the shared secret is still considered the "_something you have_" part instead of another "_something you know_" part just like the regular password? Because if the user enters the secret into the smartphone, even if from a QR code, that means the user _knows_ the secret. – SantiBailors Jun 21 '18 at 09:32
  • @SantiBailors Well, if a user chose an OTP seed like "TheNameofMyCatisPussy", this would result in the 20-byte hex string "5468654e616d656f664d79436174697350757373". If you use this 160-bit key for the HMAC-SHA1 in the TOTP algorithm, the attacker could "guess" this secret. This way your TOTP smartphone token would become a kind of knowledge - instead of possession. – cornelinux Jun 21 '18 at 14:40
  • No, here I'm not talking about the user choosing the shared secret, I'm talking about the normal case where the secret is generated by the system and entered by the user into the device (usually phone), via a QR code or other ways. Since the user is the one who entered the secret, doesn't that make the secret something that the user _knows_? – SantiBailors Jun 21 '18 at 15:06
  • 1
    This is really difficult. The user could write down the OTP seed. And initialize a second and third smartphone with it. However, he could also print the QR code and enroll another smartphone. The concept of using TOTP for smartphone combined with the google define QR code enrollment is handy, but from a security standpoint it simply sucks. I do not know if you should call it a possession or a knowledge. It is "limited" security. See https://netknights.it/en/the-problem-with-the-google-authenticator/ – cornelinux Jun 21 '18 at 15:47
  • @cornelinux I don't think it's a security issue; it's a feature. Some people might need to be able to authenticate with multiple devices. The solution you proposed, TiQR-Token, has exactly the same flaw, except that it's just more inconvenient to share the seed key if you need to. – FINDarkside Jul 27 '18 at 18:05
  • 1
    @FINDarkside If s.o. needs to authenticate with multiple devices, the system should treat each device as a unique second factor. Otherwise, if one device gets compromised you have to revoke all devices. privacyIDEA can handle multiple devices ;-) each with a unique secret key. To cope with copying of the secret key, we implemented our own app with a more secure enrollment process. https://netknights.it/en/produkte/privacyidea-authenticator-app/ – cornelinux Jul 28 '18 at 08:03