
What happens if for some reason a cell phone's clock/calendar is off by a significant amount of time? Does the TOTP (Time-based OTP) algorithm generate an invalid token? Also, do time zones play a role in the token being correct, or do both the client and the server talk to a Network Time Protocol server to ensure that everything is synced up?

So, in other words, if we wanted to bulletproof the design of the server against the above scenario, how would we go about doing this? For instance, suppose there was a 5-second differential between server time and client time, and my grandmother was typing the code on her phone and just taking a long time. How would we assess the amount of buffer time the user has to type the code, and whether it's correct or not?

Also, how do all clients ensure that their timestamps are counted from the same epoch instant that the server uses?

Ole
  • Have you done any research: https://en.wikipedia.org/wiki/Time-based_One-time_Password_Algorithm "For this to work, the clocks of the user's device and the server need to be roughly synchronized" and then "TC = floor((unixtime(now) − unixtime(T0)) / TS)" – schroeder Nov 09 '17 at 23:04
  • Yes I'm reading up on how to implement it right now - just confirming simple details as I'm going along. Just trying to understand how the server and the client should see time and perform the calculation without any glitches. How does for example Authy ensure that the timestamp on the cell phone is the same as the timestamp on the server? Does each just calculate the current instant in milliseconds and everything just works from there? – Ole Nov 09 '17 at 23:19
  • 2
  • It's in the wiki – schroeder Nov 09 '17 at 23:21
  • This is what your link says about my question: Because TOTP devices have batteries that go flat, clocks that can de-sync, and because software versions are on phones that users can lose or have stolen, all real-world implementations have methods to bypass the protection (e.g.: printed codes, email-resets, etc.), which can cause a considerable support burden for large user-bases, and also gives fraudulent users additional vectors to exploit. – Ole Nov 09 '17 at 23:42
  • I cannot use that to calculate parameters around how sensitive the server's authentication check might be to deviations between the server clock and the client clock ... that question is also asking about implementation of time stamps on different systems and how those time stamps are generated. – Ole Nov 09 '17 at 23:44
  • So in other words how do you bulletproof the implementation around the weaknesses of the design of the algorithm ... – Ole Nov 09 '17 at 23:45
  • You didn't ask any of those things, though. As I say, please edit your question to include the deeper subtleties – schroeder Nov 09 '17 at 23:46
  • Done - It's a little longer than I wanted it to be now - I can probably go into more detail, but you seem like a really smart person (Top 0.15% - probably near security genius), and I just figured you would see these details in the question as is. – Ole Nov 09 '17 at 23:52
  • 1
  • What I can see and what you need me to see can be different things. That's why adding context is important (regardless of the length of the question). Also, your more detailed question is explained in the wiki, still: "TOTP codes are valid for longer than the amount of time they show on the screen (usually two or more times longer). This is a concession that the authenticating and authenticated sides' clocks can be skewed by a large margin." – schroeder Nov 10 '17 at 10:19

2 Answers


Does the TOTP Algorithm rely on the client time always being synced correctly?

Yes

What happens if for some reason a cell phone's clock/calendar is off by a significant amount of time? Does the TOTP (Time-based OTP) algorithm generate an invalid token?

They would be unable to authenticate correctly. This is not exactly an invalid token, just a token for a different time.

do time zones play a role in the token being correct

No

do both the client and the server talk to a Network Time Protocol server to ensure that everything is synced up?

No. An NTP server is one way for either of them to get the right time, but your grandmother manually setting the current date and time (with the device configured for the right time zone) would work just as well.

suppose there was a 5-second differential between server time and client time, and my grandmother was typing the code on her phone and just taking a long time. How would we assess the amount of buffer time the user has to type the code, and whether it's correct or not?

The server can be a bit lenient by accepting a few time steps around the right one (from the server's point of view), and by using a time step that is not too small.

For example, with a time step of 30 seconds, and accepting 3 steps above or below the server's own, your grandmother gets roughly 1.5–2 minutes to type the code. Increase it to 10 steps and you would give her about 5 minutes… It's just a matter of deciding for how long you want to accept codes. Conversely, though, a wider window gives an attacker more time to fraudulently use an intercepted code.
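
A minimal sketch of what such a lenient check could look like, assuming a 30-second time step, a ±3-step window, and the RFC defaults of HMAC-SHA1 and 6-digit codes (none of this is taken from any particular library):

    import hmac, hashlib, struct, time

    def totp(secret: bytes, counter: int, digits: int = 6) -> str:
        """HOTP (RFC 4226) over a time-step counter: HMAC-SHA1 plus dynamic truncation."""
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
        return str(code).zfill(digits)

    def verify(secret: bytes, submitted: str, step: int = 30, window: int = 3) -> bool:
        """Accept the code if it matches any time step within +/- `window` of the server's own."""
        now = int(time.time()) // step  # TC = floor((unixtime(now) - T0) / TS), with T0 = 0
        return any(hmac.compare_digest(totp(secret, now + drift), submitted)
                   for drift in range(-window, window + 1))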

Regarding clock skew, note that you could record the difference between the expected and the provided codes and extrapolate the expected skew from that. If at 10:00 on day 1 the user provides the code for 09:58, and the next day the code for 09:56, the server could infer a clock skew of about 2 minutes per day and adjust its expectation for this user towards older values that would otherwise not be accepted (resetting properly when the user's clock is "fixed"). This calculation increases the complexity noticeably, though, and most of the time you won't need it.
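
If you did want that drift tracking, one hypothetical approach is to remember at which offset the last successful code matched and to centre the next search window there (this reuses the illustrative totp() helper from the sketch above):

    import hmac, time

    def verify_with_drift(secret: bytes, submitted: str, stored_offset: int,
                          step: int = 30, window: int = 3):
        """Search the window around the user's last known offset instead of around zero."""
        now = int(time.time()) // step
        for drift in range(stored_offset - window, stored_offset + window + 1):
            if hmac.compare_digest(totp(secret, now + drift), submitted):
                return True, drift  # persist `drift` as this user's new stored offset
        return False, stored_offset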

Also, how do all clients ensure that their timestamps are counted from the same epoch instant that the server uses?

The algorithm they both agree to use sets a fixed reference time from which counting starts (typically the Unix epoch).
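
As a small, purely illustrative worked example: because that reference time is fixed by the algorithm itself, nothing has to be exchanged; both sides just compute the same counter from their own clock:

    import time

    T0, TS = 0, 30                      # T0 = Unix epoch, TS = 30-second time step
    TC = (int(time.time()) - T0) // TS  # identical on client and server if their clocks agree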

Ángel

It is a common misconception that the TOTP algorithm is somehow tied to Google, or the other way around. TOTP is based on the HOTP algorithm, which was published in 2005 in RFC 4226.

The TOTP algorithm replaces the counter of the HOTP algorithm with a 30- or 60-second time slice. It is defined in RFC 6238.
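
Purely as an illustration of that relationship (not text from either RFC), the only difference is where the moving counter comes from; hotp() below stands in for an RFC 4226 implementation:

    def totp_counter(unix_time: int, time_step: int = 30) -> int:
        # TOTP's "counter" is simply the current time slice
        return unix_time // time_step

    # one_time_password = hotp(shared_secret, totp_counter(current_unix_time))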

It should be noted that the idea behind the xOTP algorithms dates from long before the first smartphone existed. No Google employee was involved in the specification, but rather several people from hardware token vendors.

Google simply adopted the TOTP algorithm for Android devices, for which it was not designed in the first place!

What happens if for some reason a cell phone's clock/calendar is off by a significant amount of time? Does the TOTP (Time-based OTP) algorithm generate an invalid token?

No, not necessarily! Section 6 of RFC 6238 recommends resynchronization: the clock in the authentication device may drift, and the authentication server should remember that clock drift. So over time the authentication device can accumulate a significant offset and still work, provided the user uses it regularly enough for the server to keep tracking the drift. My own project, the privacyIDEA authentication system, takes care of such clock drift.

Also, do time zones play a role in the token being correct, or do both the client and the server talk to a Network Time Protocol server to ensure that everything is synced up?

As Ángel pointed out, there is no need to think about time zones, since the time slices are based on the Unix system time.

And since RFC 6238 was written for hardware tokens, there is also no recommendation to use NTP: a hardware token has no internet connection, only a battery-driven local quartz clock. Of course, if you implement your own server, you should connect it to NTP. And if you are running a smartphone app as the authenticator, it should also use NTP.

So, in other words, if we wanted to bulletproof the design of the server against the above scenario, how would we go about doing this? For instance, suppose there was a 5-second differential between server time and client time, and my grandmother was typing the code on her phone and just taking a long time. How would we assess the amount of buffer time the user has to type the code, and whether it's correct or not?

This is addressed in the RFC. A 5-second differential does not matter, since it will usually fall within the same time slice. Your authentication server can check a few time slices before and after the current time, and thus know whether the clock (or the user) is a bit slow or fast.

cornelinux