
As I was unable to find any existing thread about this particular question, I'm asking the community for help.

We're currently using RSA RADIUS-based 2FA to authenticate external VPN users from partner companies so they can manage their systems inside our corporate network. Management will not extend the support contract for this solution, so we're forced to look for alternatives.

One of the favorites that has stood out so far is LinOTP, as it offers a great variety of token types, has a per-user subscription-based license model, and can manage multiple UserIdResolvers (realms).

One of the things we're confronted with is that external users appear to be handing their personalized RSA tokens over to colleagues, e.g. while on holiday, even though we've had them sign a user agreement that prohibits exactly that.

As cheap and easy as generating TOTP/HOTP-based 2FA tokens may be, the biggest downside for me is the fact that an external user could easily "share" the QR code/seed he receives from us by email with his colleagues. This is in fact even more "dangerous" than what we use right now, as an RSA hardware token cannot be duplicated.
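To illustrate why a shared seed is a full duplicate of the token, here is a minimal RFC 6238 TOTP sketch (the seed value is a made-up example): any device holding the same seed produces identical codes.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, t: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", t // step)          # 8-byte big-endian counter
    mac = hmac.new(key, counter, "sha1").digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

seed = "JBSWY3DPEHPK3PXP"      # example seed, as encoded in the enrollment QR code
now = int(time.time())
device_a = totp(seed, now)     # the legitimate user's phone
device_b = totp(seed, now)     # a colleague who photographed the QR code
assert device_a == device_b    # both devices produce identical codes
```

The server has no way to tell the two "devices" apart; the seed is the entire secret.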

One other software token type I've found that is also supported by LinOTP is "mOTP". In contrast to TOTP, the user specifies a random seed + PIN that he provides to us, instead of us providing something to him. At first glance this looked exactly like what we were looking for. Unfortunately I soon realized that user B could easily impersonate user A by simply using the same seed + PIN. So uniqueness is not given here either, though in my opinion it's at least slightly better than TOTP/HOTP.
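For illustration, here is a minimal sketch of the Mobile-OTP scheme as it is commonly described (MD5 over a 10-second time counter plus seed plus PIN; the seed and PIN values are made up): anyone who knows the seed and PIN computes the same codes.

```python
import hashlib
import time

def motp(seed: str, pin: str, t: int) -> str:
    # Mobile-OTP as commonly described: the code is the first 6 hex
    # characters of MD5(10-second time counter + seed + PIN).
    return hashlib.md5((str(t // 10) + seed + pin).encode()).hexdigest()[:6]

now = int(time.time())
user_a = motp("a1b2c3d4", "1234", now)   # the legitimate user
user_b = motp("a1b2c3d4", "1234", now)   # a colleague who was told seed + PIN
assert user_a == user_b                  # identical codes: no uniqueness
```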

During my search on the mOTP SourceForge page I found an iOS client application that might do what we want: "Mobile-OTP for the iPhone with optional UDID security extension". I assume the random seed is hashed together with a unique device identifier, which would at least make it harder to impersonate someone else and would help verify that the user really is who he claims to be.

So I think my main questions are:

  • are there any known mechanisms to prevent duplication of software tokens / guarantee the "uniqueness" of the software-token? (e.g. QR code only valid for X seconds)
  • are there any known mechanisms to ensure the identity of the user trying to login? (besides regular username, password and the token)

Things I might mention at this point:

  • a bastion host was considered and still is, but should not be in scope for this discussion
  • Cisco Duo was taken into consideration too but turns out to be quite costly, at least compared to LinOTP ($3/user/month vs. $1/user/month)
alphachris

3 Answers


You cannot solve this with technology. It's a policy problem, not a tech one.

Even if you use a hardware token, nothing stops a user from calling a colleague and reading the current code off the token. If you use a biometric-backed generator, or whatever else, they will find a way to share. Having Human Resources step in and take a strong stance on this matter will solve it.

Remind everyone that sharing OTP tokens is a fireable offense, and if (when) someone breaks this rule, fire them and call a company-wide meeting afterwards to tell everyone that Mr. OTP-Sharer was caught sharing his token and was fired.

ThoriumBR
  • Except for your idea to fire the "guilty" user, this is the same conclusion I came to after thinking everything over again. Unfortunately we're not talking about internal users of our company violating the policy, but external users that take care of "their" systems in our corporate network - which scares me even more than an internal user violating the policy, because with an internal user it's much easier to act upon. – alphachris May 24 '20 at 10:19
  • But wouldn't it be OK for you if an external company with two users managing their system in your network shared *one* hardware token, with each of the two external users using his own password alongside the shared hardware token? – cornelinux May 24 '20 at 11:38
  • So this is an X-Y problem. Your problem is that external users are working in an unsafe way and threatening your internal network. Isolate the servers of the external users; their servers, their problems. – ThoriumBR May 24 '20 at 13:17

Personal Value of MFA Devices

To minimize the risk of MFA device sharing, the MFA device should be of personal value to the user. It seems like a user's smartphone is the most personal device available, as it stores private photos, messages and other personal information that users want to keep private. Also, almost everybody has a smartphone nowadays. This is especially true for people who work in IT. Hardware tokens that only generate one-time passcodes provide no personal value to users if they are issued by their employer and are meant to be used only for work. This provides an incentive to share them among coworkers. Hence, we should focus on using smartphones as MFA devices.

Software Tokens

As smartphones usually don't come with built-in MFA authenticators, users have to install apps that provide this functionality. Apps are software, which means that we'd be using software tokens. It is easier to duplicate a software token than a hardware token. For this reason, security folks usually prefer hardware tokens that are based on FIDO authentication protocols. During the last few years, U2F security keys became very popular. Google reported that they neutralized employee phishing by using U2F keys. Unfortunately, U2F as well as newer WebAuthn security keys are by design not able to be uniquely identified for privacy reasons. This means that users can share these security keys anonymously, as the same key will identify differently during each key enrollment. This means that we must focus on software tokens in the form of mobile apps installed on smartphones.

Popular MFA Methods for Software Tokens

Mobile apps that act as authenticators are able to provide several authentication methods. The most established methods are one-time passcodes (TOTP or HOTP), which are well known from Google Authenticator, and push notifications, which are implemented by MFA vendors in their authenticator apps. Let's explore if these MFA methods can help us reduce the risk of MFA device sharing.

TOTP and HOTP

Both TOTP and HOTP are RFC standards that require users to enroll their device by scanning a QR code. This QR code can be copied and shared. Multiple mobile devices may be enrolled using the same QR code. The party issuing that QR code has no way to identify the devices that are enrolled this way, as there is no programmatic two-way communication between the device and that party. The value of generic TOTP and HOTP with regard to reducing the risk of MFA device sharing is questionable.
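Concretely, the enrollment QR code is nothing more than a plaintext `otpauth://` URI that contains the shared secret; whoever copies the image can enroll any number of devices. A sketch (issuer, account and secret are made-up examples):

```python
from urllib.parse import quote

def otpauth_uri(issuer: str, account: str, secret_b32: str) -> str:
    # The QR code simply encodes this URI. Anyone who scans, screenshots,
    # or forwards it obtains the full shared secret in plaintext.
    label = quote(f"{issuer}:{account}")
    return (f"otpauth://totp/{label}"
            f"?secret={secret_b32}&issuer={quote(issuer)}&digits=6&period=30")

uri = otpauth_uri("ExampleCorp", "alice@example.com", "JBSWY3DPEHPK3PXP")
print(uri)
# otpauth://totp/ExampleCorp%3Aalice%40example.com?secret=JBSWY3DPEHPK3PXP&issuer=ExampleCorp&digits=6&period=30
```

Nothing in the URI identifies the device that eventually scans it, which is why the issuing party cannot detect duplicate enrollments.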

Mobile Push

Mobile push authentications are push notifications sent to a user's mobile device that enable them to deny or approve login requests by tapping the corresponding button. They offer a fast and easy user experience and have been recommended by Gartner as they provide “better user experience and lower cost than legacy 2FA methods”. There is no RFC standard for implementing mobile push authentication, hence every vendor provides their own solution, usually together with their own mobile app. This enables vendors to develop custom approaches that take MFA device sharing into account. It looks like mobile push authentication, enabled by push notifications on iOS and Android, is the right direction.

How to Reduce Risk of MFA Device Sharing

To reduce the risk of MFA device sharing, the authenticator app that provides push authentication should be limited to only one mobile device per user. Otherwise, the user could also install it on a non-personal mobile device that is shared between coworkers. Several nuances have to be taken into account:

  1. What if a user loses their smartphone and needs to install the authenticator app on their new phone? That should be made possible, but the authenticator app on the user's previous device should be deactivated.
  2. How is the user verified when they install the authenticator app on a new device? If this is handled loosely, then an attacker may take over the user's authenticator app by installing it on the attacker's device. The installation on a new device could be approved by the user, e.g. by contacting them through a different channel (e.g. email or phone). Alternatively, such requests could be handled by an admin who is able to verify the requesting user, e.g. in person, by phone or by email. The MFA solution should inform the admin if that mobile device has been previously used by that or any other user. With this information, the admin could see whether an attempt at MFA device sharing is taking place and react accordingly.

It seems that using a multi-factor authentication product that takes the above into account would minimize the risk of MFA device sharing.
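A hypothetical sketch of such a one-device-per-user policy (all names are illustrative, not any vendor's API): enrolling a new device deactivates the previous one, and re-use of a device by another user is surfaced so an admin can question it.

```python
class EnrollmentRegistry:
    """Hypothetical server-side registry enforcing one device per user."""

    def __init__(self):
        self.active = {}        # user -> currently enrolled device_id
        self.seen_devices = {}  # device_id -> set of users who ever enrolled it

    def enroll(self, user: str, device_id: str) -> dict:
        previous = self.active.get(user)
        self.active[user] = device_id            # old device stops working
        owners = self.seen_devices.setdefault(device_id, set())
        shared_with = owners - {user}            # was this device someone else's?
        owners.add(user)
        return {"deactivated": previous,
                "previously_used_by": sorted(shared_with)}

reg = EnrollmentRegistry()
reg.enroll("alice", "phone-1")
info = reg.enroll("alice", "phone-2")            # alice replaces her phone
assert info["deactivated"] == "phone-1"          # phone-1 is deactivated
flag = reg.enroll("bob", "phone-2")              # bob enrolls alice's phone
assert flag["previously_used_by"] == ["alice"]   # admin should question this
```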

Implementation Example: Rublon

As I work at Rublon, I'm able to tell you about an implementation example of the above proposal. Rublon's multi-factor authentication allows you to enforce users to use the Mobile Push MFA method that is provided by the Rublon Authenticator mobile app. By policy, users can install Rublon Authenticator on only one mobile device. If a user installs it on a new mobile device, then it stops working on their previous mobile device.

Right now users may install Rublon Authenticator on a new device without an administrator's approval. A policy that will allow admins to enforce such an approval will become available soon. Such an approval request will tell you if that new mobile device has been previously used by this or any other user for Rublon Authenticator. This will allow you to question the user's intentions and ask for clarification.

multithr3at3d

As @ThoriumBR pointed out, at the end of the day users can always hand over their hardware. So the question is: which hardware will they probably not hand over? Probably their smartphone.

But with the smartphone you have the problem, as you already mentioned, that the cryptographic key which makes up the 2nd factor could be copied. I wrote a more detailed wrap-up on this some years ago.

The best thing would be to actually have the smartphone generate its own private key. The UDID extension you mentioned is a step in this direction.

Disclaimer: I am deeply involved in developing the solution I am about to mention. My company actually provides services for this open-source solution. So if you are offended by this, stop reading!

You might want to take a look at privacyIDEA, which started as a fork of LinOTP about 6 years ago. You might recognize some familiar aspects like the resolvers, but it has developed in quite a different direction on GitHub.

privacyIDEA provides two token types, which might be of interest to you.

2step enrollment

With HOTP and TOTP smartphone tokens, privacyIDEA adds a "2step enrollment", where the smartphone generates a client part of the shared secret that is transferred to the privacyIDEA server. The resulting secret is calculated from the server part and the client part. Each smartphone generates its own client part, so there is no easy way of copying the token. (However, a user could write down the server and client parts and manually calculate the resulting secret - but again, this is a policy matter.) You can find more about this in the online documentation. There is an app that supports two-step enrollment for Android and iOS.
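An illustrative sketch of the idea, not privacyIDEA's exact derivation (the use of PBKDF2 and the round count here are assumptions): the OTP seed is derived from both a server part and a client part, so the server-issued part alone is useless to anyone who copies the enrollment QR code.

```python
import hashlib
import os

server_part = os.urandom(20)   # generated by the server, shown in the QR code
client_part = os.urandom(20)   # generated on this phone only, never in the QR

# Illustrative derivation of the final OTP seed from both parts
# (PBKDF2 with 10000 rounds is an assumption, not the exact scheme).
seed = hashlib.pbkdf2_hmac("sha1", server_part, client_part, 10000)

# A second phone that only saw the QR code generates a different client
# part and therefore ends up with a different, useless seed.
other_client_part = os.urandom(20)
other_seed = hashlib.pbkdf2_hmac("sha1", server_part, other_client_part, 10000)
assert seed != other_seed
```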

push token

The other possibility would be to not do OTP at all. Since an OTP is meant to be entered manually, you can only use a symmetric key as the second factor. If you avoid manual second-factor input, you can also use asymmetric keys. The push token makes this possible: during enrollment it generates a unique key pair on the smartphone, and the private key never leaves the smartphone. However, a user could still hand over his smartphone (very unlikely).
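A toy-sized textbook-RSA sketch of the concept (the numbers are deliberately tiny and insecure, purely for illustration): the private key is generated on the phone, is never transmitted, and signs the server's login challenge, so copying the enrollment QR code gains an attacker nothing.

```python
# Toy textbook RSA with tiny made-up primes - illustration only, never
# use parameters like these in practice.
p, q = 61, 53
n, e = p * q, 17                     # public key (n, e), shared with the server
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, stays on the phone

challenge = 42                       # random login challenge from the server
signature = pow(challenge, d, n)     # the phone signs the challenge

# The server verifies with the public key alone; only the phone holding
# d could have produced this signature.
assert pow(signature, e, n) == challenge
```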

The push functionality also comes with several infrastructure challenges, so I would recommend taking a look at 2step first.

You can find out more about push here.

I hope this also helps from a conceptual standpoint.

cornelinux
  • Thank you for your detailed write-up! We'll definitely look into privacyIDEA! After digging further into our use-case scenario and thinking about the psychological aspect, I'm considering generating OTPs by email for external users. Although it's not as secure, it's *very* unlikely the external user will share his email password with his colleagues or reply to questions from his colleagues while he's absent from work (e.g. holiday, day off, ..). This would also be the easiest way for our use case, as external parties might not want to carry OTPs on their mobile phones. – alphachris May 24 '20 at 10:34
  • I totally get the need for OTPs via email sometimes. You might also consider sending OTPs via SMS/text messages. :-/ Both are also supported by privacyIDEA - so you can have every 2nd factor (weak or strong) in one place. – cornelinux May 24 '20 at 11:36