Does a computer really drift by more than a second in a single day? I can barely imagine that.
Yes, some do, especially in virtualized environments. Running inside a virtual machine (as is the case on Azure) invalidates some of the fundamental assumptions used to keep time, making it even harder than on physical hardware. A modern computer is a lot more complex than a digital clock.
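For a sense of scale, here is a rough sketch of how fast an ordinary quartz oscillator can drift; the 50 ppm tolerance is an assumed typical value, not a measured one:

```python
# Back-of-the-envelope drift estimate for a quartz oscillator.
# 50 ppm is an assumed, typical crystal tolerance; real parts vary
# (roughly 20-100 ppm), and VM scheduling jitter can make it worse.
PPM = 50
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

drift_per_day = SECONDS_PER_DAY * PPM / 1_000_000
print(f"Worst-case drift at {PPM} ppm: {drift_per_day:.2f} s/day")  # ~4.32 s/day
```

So even a few seconds per day is plausible from the oscillator alone, before the VM layer makes things worse.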
Why not sync to a time server once a day? That should have minimal negative impact compared to the advantage of a (relatively) consistent clock across servers.
That's what people do; it's called NTP. But even that is more difficult than you might imagine: some services need time to flow continuously, so jumping the clock forward or backward to correct it is not acceptable, and you end up slowing down or speeding up the system clock instead (slewing), which takes time.
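To illustrate "takes time": a minimal sketch of how long slewing needs, assuming the classic ntpd maximum slew rate of 500 ppm (the actual rate depends on the daemon and its configuration):

```python
# How long does it take to slew away a clock offset?
# 500 ppm is the classic ntpd maximum slew rate (an assumption here;
# chrony and other daemons can be configured differently).
MAX_SLEW_PPM = 500

def slew_duration(offset_seconds: float) -> float:
    """Seconds of wall time needed to absorb the offset by slewing."""
    return offset_seconds / (MAX_SLEW_PPM / 1_000_000)

offset = 1.0  # a one-second error
print(f"Slewing {offset} s away takes ~{slew_duration(offset) / 60:.0f} minutes")
# ~33 minutes; a multi-second offset can take hours to correct smoothly.
```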
My gut feeling is that the biggest problem would be latency, but even that should only amount to a few ms after each sync.
NTP already compensates for latency: it measures the round-trip delay and assumes the network path is symmetric. There are also protocols like PTP designed for even higher precision.
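Concretely, NTP works from four timestamps per request/response exchange; here is a sketch of the standard offset and delay formulas (variable names are mine):

```python
# Standard NTP offset/delay computation from one request/response exchange.
# t0: client send time, t1: server receive time,
# t2: server send time, t3: client receive time.
# The offset formula assumes the network path is symmetric, which is
# the main source of residual error on asymmetric links.

def ntp_offset_delay(t0: float, t1: float, t2: float, t3: float):
    delay = (t3 - t0) - (t2 - t1)          # round-trip time minus server processing
    offset = ((t1 - t0) + (t2 - t3)) / 2   # estimated clock offset
    return offset, delay

# Example: client 0.1 s behind the server, 20 ms RTT, 5 ms server processing.
print(ntp_offset_delay(10.000, 10.110, 10.115, 10.025))  # (0.1, 0.02)
```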