Ask HN: Why are modern devices so terrible at accurate timekeeping?
I have a server to which I constantly synchronize all of my important files as a read-only backup. The issue: when I make a change on device A, sync it to device B, make another change there, and then sync everything to my server, there is a reasonable chance that the timestamps of the two changes will overlap or even come out in reverse order! That, obviously, leads to data loss.
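To make the failure mode concrete, here is a minimal sketch of a naive last-writer-wins sync that trusts file mtimes. The device names, clock skew, and edit times are invented for illustration:

```python
# Sketch: why last-writer-wins sync breaks when device clocks disagree.
# Devices, skew, and edit times below are made-up examples.

def newer_wins(edit_a, edit_b):
    """Pick the edit with the larger (local) timestamp, as naive sync tools do."""
    return edit_a if edit_a["mtime"] >= edit_b["mtime"] else edit_b

true_time = 1_700_000_000          # real moment of the first edit (Unix seconds)
skew_b = -5                        # device B's clock runs 5 seconds slow

edit_on_a = {"device": "A", "content": "v1", "mtime": true_time}
# The second edit happens 3 real seconds later, but B stamps it with its slow clock:
edit_on_b = {"device": "B", "content": "v2", "mtime": true_time + 3 + skew_b}

winner = newer_wins(edit_on_a, edit_on_b)
print(winner["content"])           # "v1" — the genuinely newer edit v2 is silently discarded
```

With only a few seconds of skew, the chronologically later edit looks older and gets thrown away, which matches the data loss described above.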
What boggles my mind is how easy this would be to fix! Windows already uses NTP to sync the system clock, but it seems to do this so rarely that it regularly accrues seconds of drift. Why? An NTP sync costs negligible system resources.
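For context on how cheap this is: one NTP round (RFC 5905) is a single UDP exchange that yields four timestamps, from which the client computes its clock offset. A sketch of that arithmetic, with made-up timestamp values:

```python
# Sketch of the core NTP/SNTP arithmetic (RFC 5905): one UDP round trip
# yields four timestamps, from which the client derives its clock offset.
# The timestamp values below are invented for illustration.

t0 = 100.000   # client's clock when the request leaves
t1 = 102.010   # server's clock when the request arrives
t2 = 102.011   # server's clock when the reply leaves
t3 = 100.025   # client's clock when the reply arrives

# Assuming roughly symmetric network delay, the client's clock error is:
offset = ((t1 - t0) + (t2 - t3)) / 2
delay = (t3 - t0) - (t2 - t1)

print(f"offset: {offset:+.3f} s")          # here: client is ~2 seconds behind
print(f"round-trip delay: {delay:.3f} s")
```

That is the entire per-sync cost: two tiny packets and a handful of subtractions, so there is no resource-based reason not to do it often.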
Even worse is my phone. It can obviously receive GPS signals for geolocation, and therefore has access to one of the most accurate time signals available. But it doesn't even use it! When my phone is disconnected from any network, its clock simply drifts further and further while the GPS antenna delivers a constant stream of atomic-clock-backed timestamps that get thrown away. And while it is connected to a network, it shows the same weird reluctance to sync with time servers.
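The GPS signal really does carry usable wall-clock time: satellites broadcast their own atomic timescale (GPS time, which has no leap seconds) plus the current GPS-UTC offset, so a receiver can recover UTC directly. A sketch of that conversion, assuming the constant 18-second offset in effect since 2017 (a real receiver would read the broadcast offset instead):

```python
# Sketch: converting a GPS week / time-of-week fix to UTC. GPS broadcasts its
# own atomic timescale with no leap seconds; receivers subtract the GPS-UTC
# offset, which the satellites also broadcast. The 18 s constant is the
# offset in effect since 2017 and is hardcoded here for illustration only.
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)  # start of GPS time
GPS_UTC_LEAP_OFFSET = 18  # seconds GPS time is currently ahead of UTC

def gps_to_utc(week: int, seconds_of_week: float) -> datetime:
    gps_time = GPS_EPOCH + timedelta(weeks=week, seconds=seconds_of_week)
    return gps_time - timedelta(seconds=GPS_UTC_LEAP_OFFSET)

print(gps_to_utc(0, 18.0))  # 18 s into week 0, minus the offset: back at the epoch
```

So the phone has everything it needs to keep UTC while fully offline; it just doesn't feed the receiver's time into the system clock.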
Genuinely, am I missing something? My old $30 Casio radio-controlled watch keeps more accurate time than all of the far more sophisticated and expensive tech around me. Why?