When the previous leap second was applied, a bunch of our Linux servers had kernel panics for some reason, so needless to say everyone was really concerned about a leap second happening during trading hours.
So I was assigned to make sure nothing bad would happen. I spent a month in the lab, simulating the leap second by fast forwarding clocks for all our different applications, testing different NTP implementations (I like chrony, for what it's worth). I had heaps of meetings with our partners trying to figure out what their plans were (they had none), and test what would happen if their clocks went backwards. I had to learn about how to install the leap seconds file into a bunch of software I never even knew existed, write various recovery scripts, and at one point was knee-deep in ntpd and Solaris kernel code.
After all that, the day before it was scheduled, the whole trading world agreed to halt the markets for 15 minutes before/after the leap second, so all my work was for nothing. I'm not sure what the moral is here, if there is one.
> He went away from the basement and left this note on his terminal: "I'm going to a commune in Vermont and will deal with no unit of time shorter than a season."
[0] https://en.m.wikipedia.org/wiki/The_Soul_of_a_New_Machine
– Seemingly simple tasks can be more complex than you expect (“add a leap second on this Wednesday”)
– Real world systems can be more complex than you expect (“bunch of software I never even knew existed”)
– Planning and testing can make a big difference vs. just winging it (“a bunch of our Linux servers had kernel panics for some reason”)
– Success can be a non-event that goes unnoticed (“everything worked and no money went missing”)
– Sometimes the best solution is not a technical solution (“halt the markets for 15 minutes before/after”)
We've had an election recently, right on the day when DST changed. On the night of counting of the votes, the clock went 2:59 AM -> 2:00 AM.
To save themselves trouble the Statistics Office instructed all vote counters that under no circumstances are they to enter or update anything in any system during the repeating hour until it's 3:00 AM the second time…
I once came across an early 1950s Scientific American article by Bertrand Russell, IIRC. It included a cartoon.
Frame one: Computer beats man at chess.
Frame two: Man unplugs computer.
And yet, there are still Y2K deniers (to be fair some people have exaggerated it to the point that they're promoting it as the end of the world).
I'm a little confused. How does this solve the problem? If you don't code for the second, you'll still be off if you wait. Am I missing something?
It always pays to not be the least-prepared among your cohort. You'll get no sympathy if you're at the back of the pack, you'll just die.
This is a hugely valuable learning experience few people even get a chance at, let alone solve. The only downside is that it doesn't show up on your resume!
Interview discussion? If you're any good at interviewing, it should.
...okay yeah that's not a moral, but still.
$work had thousands of fully custom, DSP-heavy location-measurement hardware devices widely deployed in the field for UTDOA locating of cell phones. They used GPS for time reference -- if you know your location, you can get GPS time accurate to within tens of nanoseconds. GPS also broadcasts a periodic almanac which includes leap second offsets: if you applied the offset to GPS time, you could derive UTC. Anyway, there were three models of these units, each with an off-the-shelf GPS chip from one of three location vendors you've probably heard of. The chip firmware was responsible for handling leaps.
One day, a leap second arrived from the heavens. We learned the three vendors all exhibited different behaviors! Some chips handled the leap fine. Some ignored it. Some just crashed, chip offline, no bueno, adios. And some went into a state that gave wildly wrong answers. After a flurry of log pulling, debugging, console cabling, and truck rolls, we had a procedure to identify units in bad states and reset them without too many getting bricked.
It seems the less likely an event is to occur, the less likely your vendor put work into handling it.
Idle curiosities can lead to their own waste, but the kernel panic was probably worth digging into earlier.
Apparently, it's about as useful as the leap second itself ;)
I feel your pain though, as I've spent weeks on something only for it to be tossed away like it was nothing at the last second. I guess that's how Google devs feel when their projects are deprecated. At least theirs saw the light of day and provided some validation
It is clear to me that the disparity of latency creates islands of privilege. I mentioned this to someone in the industry once and they replied that what the layman perceives as parasitic middlemen actually provide valuable liquidity. When I asked whether they considered ticket-scalpers to likewise provide liquidity they claimed that was not at all the same thing.
I think the moral is that it'd be a lot easier if we could just stop messing with the clocks, or at least push more technical things towards only caring about a closest-to-a-global-high-precision-monotonic-clock-as-relativity-allows rather than worrying about what the clocks on the walls say, which is more a personal matter of how much you care or don't where the sun is in the sky at 12:00:00.000.
Edit: I guess the other way to look at it is that we now know how much you can make in a few minutes of trading, seeing that it was worth putting at least one software engineer on it for a long time despite the risks...
As the CIA director in Burn After Reading says, "I guess we learned not to do it again."
So his work contributed to community wisdom, and that influential community has probably had some say in cancelling leap seconds. I wouldn't call his work wasted. I would call that notably few degrees-of-separation in making an observable difference.
Always procrastinate :-)
Sometimes the best (for some definition of best) solution to a problem is to side-step it entirely.
You're conscientious and willing to dig in to the details to fix a problem. Plenty of people aren't, and plenty of those are doing the same job as you. Look up from your own little world and try to figure out what other people are doing, how they're doing it, and why. This applies generally: If you fixate on a specific language or toolkit, you'll miss others which fix or obviate bugs you were resigned to living with. Same with OSes and environments. It even applies to relationships, which is why a big hallmark of abuse is isolating the victim.
Do you mean at your desk? What is a lab in a fintech context?
If we kick the can down the road such that eventually we'll need to add a leap minute, we're going to end up with software that was never written to expect time to change in such a way, hasn't had a real world test of the change for decades, and will have no one working on the software who ever had to deal with such a change.
It's going to be much worse for software reliability to have a leap minute on the order of once a century than a leap second every few years.
Eventually, thousands of years from now when time has drifted by an hour or more (assuming modern technological civilization even still exists by then), each jurisdiction can just change their time zone's offset from UTC, without coordinating with anyone else. Jurisdictions making time zone changes is a well-understood situation that we already know how to deal with.
Leap seconds are an architectural blunder that always belonged in the abstraction layer that lines up the sun with the rotation of earth (the time zone abstraction). It never belonged in the part that counts seconds.
Thanks to glaciers melting, Earth's rotation is (temporarily) accelerating. Because of that, positive leap seconds, previously regular, haven't happened since 2017 - so there could very well be (recent) software out there that has that code path broken, and nobody has noticed yet.
And due to the exact same geophysical effect we might see a negative leap second - something that has never ever happened before. What are the odds that every single piece of software gets that one right?
It'd be interesting to estimate the size of the various effects (no doubt I've missed plenty of others), but is it really true that a change in sign of the acceleration of Earth's angular velocity is down to climate change?
Even over thousands of years when an hour of drift is accumulated there won’t be a manual adjustment - people will have just gotten used to different times of day having sunlight, with generations having been born and died with mean solar time happening at 11am.
Eventually the rotation of the earth may change enough that drift accumulates too quickly and leap time needs to be added, but that’s only going to be true thousands to tens of thousands of years in the future.
That'd test all the software paths.
The daylight savings time bugs I've run into at pretty much every company I've ever worked at would beg to differ.
So this isn't a once-a-century thing, it's an adding-a-leap-15-minutes-once-a-millennium issue.
Odd. I was wondering how fast it actually was long term, and this rate from historical record seems much lower. They cite 1.8ms/century if I'm reading it correctly with some odd cyclical thing going on. " the change in the length of the mean solar day (lod) increases at an average rate of +1.8 ms per century. "
I mean, we've added 22 seconds over 50 years. Although at the current rate it would still just be 7 minutes after a millennium :)
edit You know, nevermind, that's all covered on wikipedia. https://en.wikipedia.org/wiki/Leap_second#Slowing_rotation_o...
Plenty is not.
The Swift time type ignores them in its implementation. I filed an issue, and they said there were no plans to implement them.
Good choice it turns out.
Who would have thought that adding a second at random new year changeovers would be worse than letting clocks drift.
Me for one.
We can have a "leap hour" in a thousand years. Till then, do we care if the clocks and the sun drift very slowly apart from each other? I do not.
I can just imagine Graybeards of the future rushing ahead of the leap minute to update the JVM lest the world goes in flames yet again.
Earth's rotation relative to the Sun is a whole other deal.
- the measurement of durations.
- the presentation of some timestamp in a way that the reader has some intuition for.
that first purpose won’t be hurt by not tracking leap seconds. actually, a lot of applications will probably more accurately measure durations by eliminating leap seconds.
if leap seconds (or minutes) really are of critical importance, we’ll reintroduce them to the presentation layer. the thing is, very few people can tell the difference between 12:01 and 12:02 without being told the “real” time. so if you’re presenting a time which is “off” by a minute because there’s no leap seconds… does it really matter?
1) Seconds since 00:00:00 UTC 1.1.1970. This value increases by 1 each atomic second and never jumps forward/back. Call this Universal Monotonic Time.
2) The difference between when the sun is at its zenith at Greenwich and 12:00 UMT. Call this the Astronomic Drift.
3) The timezone - offset from Greenwich that makes the local clock sync up with astronomic time and also contains a DST offset if that location observes DST at that date.
By adding up 1) + 2) + 3) you end up with the "human time" at a given location at a given date.
A computer system should only ever store 1). Then, it can calculate human time when displaying it to humans.
I'm also a fan of having "local sun time" which would be the time according to the position of the sun in the sky, quantised to 15-minute slices (basically micro-timezones). It would be nice if office hours, school times, &c can be defined based on that, i.e. work starts at 9am local sun time, which will sync up better with people's biological clock and cut down on the yearly stress DST causes.
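A minimal sketch of composing layers 1) + 2) + 3) in Python (the astronomic drift is treated as a hypothetical published constant, and the function name and parameters are mine):

```python
from datetime import datetime, timedelta

def human_time(umt_seconds, astronomic_drift_s, tz_offset_s):
    """Compose 'human time' from the three layers described above:
    1) Universal Monotonic Time (seconds since the 1970 epoch, never jumps),
    2) Astronomic Drift (a hypothetical published correction, here a constant),
    3) the time zone / DST offset for the location, in seconds."""
    epoch = datetime(1970, 1, 1)
    return epoch + timedelta(seconds=umt_seconds + astronomic_drift_s + tz_offset_s)

# e.g. 1 Jan 2024 00:00:00 UMT, a drift of -2 s, and a UTC+01:00 zone with no DST
print(human_time(1704067200, -2, 3600))  # 2024-01-01 00:59:58
```

The point is that only the first argument is ever stored; the other two are lookups applied at display time.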
>If we kick the can down the road such that eventually we'll need to add a leap minute
The drift is not in a single direction. The total drift is not going to be significant for a very long time if ever.
That doesn't stop the (e.g.) FreeBSD folks from running tests to make sure things are fine:
* https://lists.freebsd.org/pipermail/freebsd-stable/2020-Nove...
* https://docs.freebsd.org/en/articles/leap-seconds/
Of course there's a whole lot of other userland code besides ntpd.
Storing anything as UTC was a mistake; we should be using TAI for all storage and computation, only transforming into more human-friendly formats for display to end users. This never needed to be a problem, except that we decided to make TAI harder to use than UTC, and so everything got built up off the back of legacy BIOS-level, hardware-supported UTC-style clock behaviour when we should have been using TAI from the start. Yes, I know it would have been harder, but we got off our collective asses and fixed our short-sighted decision making for Y2K date storage, so why not this? If it truly costs as much for everyone to endure a leap second, why wasn't it just fixed from the bottom up and rebuilt correctly?
The point is, that there ALREADY EXIST both TAI and UTC. TAI is true monotonic (whatever it means in a relativistic universe) and doesn't make any compromises. UTC abolishes monotonicity in order to keep both the length of a second and the time relationship to the orbital rotation. They both work. For whatever reason (for obvious reasons that is, but doesn't matter) UTC was chosen in virtually any software system to keep time.
So, ok, if there is a suspicion of leap seconds being unnecessary. How about moving from UTC time to TAI then? Let's keep UTC as it is, keep adding leap seconds and just make it a best practice to rely on TAI for all datetime operations and world clock synchronization? Maybe it will work out, maybe it won't, but at least you won't be breaking a perfectly working alternative (currently — mainstream) system.
The more I think about it, the more outrageously stupid abolishing leap seconds seems.
If they weren't going to do that, then why eliminate leap seconds? Kicking the problem down the road doesn't really solve the problem, it just makes it worse later.
What was chosen really isn't UTC. Several UTC seconds in the past are not accurately representable in unixtime. Several unixtime seconds in the past are ambiguous as to which UTC second they are.
Unixtime is awfully close to UTC time, but it's not the same. If UTC time stops inserting leap seconds and never has negative leap seconds, then they will be equivalent going forward.
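The ambiguity is easy to demonstrate: under the POSIX convention, the leap second at the end of 2016 shares a Unix timestamp with the instant after it. A sketch (the carry logic for second 60 is mine; `calendar.timegm` itself knows nothing about leap seconds):

```python
import calendar

def utc_to_unixtime(y, mo, d, h, mi, s):
    """POSIX-style mapping of a UTC tuple to Unix time. A leap second
    23:59:60 gets squashed onto the same Unix timestamp as the
    following 00:00:00, via a manual carry."""
    if s == 60:  # leap second: same unixtime as the start of the next minute
        return calendar.timegm((y, mo, d, h, mi, 59)) + 1
    return calendar.timegm((y, mo, d, h, mi, s))

leap = utc_to_unixtime(2016, 12, 31, 23, 59, 60)   # 2016-12-31T23:59:60Z
after = utc_to_unixtime(2017, 1, 1, 0, 0, 0)       # 2017-01-01T00:00:00Z
print(leap, after, leap == after)  # 1483228800 1483228800 True
```

Two distinct UTC seconds, one unixtime value: any code that round-trips through Unix time has lost information.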
They are recommending TAI for storage, compute, and not against UTC for human consumption
Although the easier hack: Abolish leap seconds from UTC!
Original comment: Only those who have never tried to actually use TAI themselves can claim that you can use TAI instead of UTC without a problem.
That won't break anything...
I'm not sure why we don't define an intergalactic time standard and approximate that everywhere with NTP-like protocols; one monotonic clock at rest (with respect to CMBR) in free space. The second is weirdly defined/tracked in Earth's gravity well.
As long as the Earth exists and you can communicate with people there, there's no practical reason not to use an earth based reference clock.
The clocks used to build TAI (it's a coordinated average of dozens of atomic clocks around the world) became sufficiently accurate by the early 70s that the length of each second differed measurably based on the altitude of the clock measuring it. As a consequence, it was decided that as of 1 January 1977 00:00:00, TAI would be corrected to correspond to what TAI would be if measured by clocks at the geoid (mean sea level), and as a result it has no relation to altitude or accelerations. There is also (because metrologists are like this sometimes) a continually published version of what TAI was before 1 January 1977 00:00:00, but it is now named EAL (Échelle Atomique Libre, meaning Free Atomic Scale).
In addition to this, we have already designed and maintain time standards equivalent to TAI but for other reference points. Geocentric Coordinate Time (TCG, Temps-coordonnée géocentrique) is, roughly speaking, TAI for a clock orbiting the sun where the Earth-Moon barycentre orbits, but without the Earth and Moon's gravitational influence. And for the entire solar system there is Barycentric Coordinate Time (TCB, from the French Temps-coordonnée barycentrique), which is, roughly speaking again, equivalent to a TAI-style clock but this time subtracting the entire solar system's influence, as if a clock keeping TAI were just orbiting the galaxy at the barycentre of the entire solar system.
The cutting edge of this is building up astronomical data on ultra-stable pulsars to use as "external" reference clocks far outside the solar system, but the complexity of subtracting the effect of everything the pulsars' radiation beams pass through before they get to us makes it quite challenging. Still, the utility for deep space navigation has made it an actively funded path of research for at least the last decade: the goal is a GPS equivalent at lunar distance and beyond, where it rapidly becomes impractical to fly a GPS-like orbiting constellation due to inverse-square radio broadcast power limits. (A good radio can pick up GPS at the moon, but the location precision out at that distance... is not great.)
The cosmic microwave background dipole may indicate that we can use it as an absolute frame of reference, but settling that with enough certainty to base an official time standard on it seems some time away, given the state of things between cosmology, astronomy, astrophysics and metrology.
> more human friendly formats for display to end users.
This is what's doing the very heavy lifting in your proposal. Leaving aside not knowing when future leap seconds will occur (and thus getting mismatches when broadcasting to different computers that may or may not get the information about the leap seconds at different times) the sheer fact of the matter is that software developers are users too. They will take shortcuts and display TAI as UTC because "something, something people are lazy or uneducated."
We do not need leap seconds. We never should have implemented them. They are a scar on our software for potentially hundreds of years already for any application that seeks to have high accuracy over time.
Time is very frequently a join key or part of a join key in a database and these small differences mean countless hours wasted to investigate "couple of record" mismatches.
Just stop using leap seconds. We will be fine.
What applications and date-time libraries should really do is differentiate between timestamps, calendar plus wall-clock time, and elapsed runtime. In most circumstances, only the latter would really need to be consistent with TAI.
The straight up truth is that we created something in between. Some parts of earth sun alignment are in the time zone abstraction layer and leap seconds are in the seconds count layer. There's no real cause for this and we should have moved to TAI to fix this blunder long ago.
EDIT: the info below is incorrect about UTC not being monotonic, as pointed out in the thread, but it's still useful as a monotonic vs non-monotonic illustration:
In UTC you can jump forward or back, so it's possible to do an operation after another operation but have a timestamp before it, which is bad for many reasons, top being auditing.
do operation one at T0
do operation two at T1
do operation three at T-1

in TAI it would always be:

do operation one at T0
do operation two at T1
do operation three at T2
How do you add a full day, if you do not know whether a leap second occured or not?
I had to look up TAI. I disagree. UTC exists for a reason. But I am here to tell a war story.
Before I arrived on the scene a customer said they wanted local time in the reports. (As in Wall Clock Time).
The Customer is Always Right. OK?
So the times went into the database as wall clock time.
The designers of the data schema made a simplifying decision to store everything as text.
No time zone went into the text string that described the time. (?? I do not know why. So easy it would have been)
I come along and have to code animations using that time series data.
Very few problems...
As you would expect there are a lot of other problems with that database.
So, in one century, we'll get 1 minute's worth of drift.
Recall that we all share the same clock within timezones, and 1 minute of drift between atomic & solar clocks is the equivalent of traveling 1/60th of your timezone's width to the east or west ... something many people do every day as part of their commute. _Everyone's_ clock deviates from their local solar noon, and _nobody cares_.
Put another way: (at most) one north-south line in your timezone will have solar noon & clock noon line up. Over time the relative location of that line will move. Fine. Let's not screw with our clocks in an effort to keep the location of that line fixed.
Leap seconds are a solution in search of a problem.
I find this off-hand comment dismissive and out of touch. The semi-annual switch from Daylight Savings to "normal" and back again is absurd, and far from not affecting anyone. Studies show that productivity drops for about a week following the change [1], and there is a marked increase in road fatalities after the clocks are adjusted [2].
If anything, this leap second business is what's irrelevant to everybody except a handful of obscure boffins.
[1] https://www.healthline.com/health-news/daylight-saving-can-m...
[2] https://www.boston.com/news/jobs/2016/03/16/daylight-saving-...
it happened again to me just a few weeks ago with the DST switch. i was so mad when i learned that everything had felt off the day of the switch because it was off that i just set all my clocks to UTC. no more gaslighting: my clocks all accurately measure intervals with no tricks, which is exactly what i most want out of my clock.
And the leap year problems of the Julian calendar weren't a problem… until they were. And then good luck coördinating things:
* https://en.wikipedia.org/wiki/Gregorian_calendar#Adoption_by...
* https://en.wikipedia.org/wiki/Adoption_of_the_Gregorian_cale...
The actual Unix time does not change when the clocks are switched between daylight saving time and standard time. Only the time zone changes.
When a leap second occurs, the actual Unix time changes, which can lead to bugs, e.g. when a positive time difference comes back as negative. To prevent such issues, a monotonic clock can be used to measure time intervals.
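In Python, for instance, `time.monotonic()` is the clock to use for intervals; a quick illustration:

```python
import time

# Wall-clock time can jump (NTP steps, leap second handling), so never use
# it to measure intervals. A monotonic clock never goes backwards, so the
# delta below is guaranteed to be non-negative.
start = time.monotonic()
time.sleep(0.1)
elapsed = time.monotonic() - start

assert elapsed >= 0  # always holds; a time.time() delta can go negative
print(f"elapsed: {elapsed:.3f}s")
```

Note that a monotonic reading is only meaningful as a difference between two readings in the same process; it is not a timestamp.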
I'd argue there should be no adjustment until it gets to be +- 15 minutes from UT1. And even then, NO clock skewing, just a time zone adjustment across the board.
We already rejigger the time zones from time to time, so that's already handled. It's still a source of issues, sure, but it's one source instead of two.
The timezone rejiggering could also be set hundreds of years in advance... rather than the very short notice we get for leap seconds.
[…]
> The CGPM — which also oversees the international system of units (SI) — has proposed that no leap second should be added for at least a century, allowing UT1 and UTC to slide out of sync by about 1 minute. But it plans to consult with other international organizations and decide by 2026 on what upper limit, if any, to put on how much they be allowed to diverge.
So everything about this hasn't quite been sorted out yet.
At some point there may need to be a reckoning like was done with the calendar:
> Second, in the years since the First Council of Nicaea in AD 325,[b] the excess leap days introduced by the Julian algorithm had caused the calendar to drift such that the (Northern) spring equinox was occurring well before its nominal 21 March date. This date was important to the Christian churches because it is fundamental to the calculation of the date of Easter. To reinstate the association, the reform advanced the date by 10 days:[c] Thursday 4 October 1582 was followed by Friday 15 October 1582.[3]
* https://en.wikipedia.org/wiki/Gregorian_calendar
As annoying as handling a leap second could be, if it happens even somewhat regularly it can be testing more often. Deciding in the future to do a 'one-off' event may be more challenging from both a coördination point of view, as well as trying to handle a rare event correctly in (e.g.) code.
just build a planetary motor.
encircle the earth with ferromagnetic coils, then fire asteroids (similarly banded) at the earth in a series of near misses; the magnetic drag steals or imparts angular momentum depending on the direction of the asteroid relative to the earth.
Like I said: it could become a huge coordination problem.
* https://en.wikipedia.org/wiki/Gregorian_calendar#Adoption_by...
* https://en.wikipedia.org/wiki/Adoption_of_the_Gregorian_cale...
To quote the relevant section:
> [the CGPM] decides that the maximum value for the difference (UT1-UTC) will be increased in, or before, 2035
> [CGPM requests that the ITU] propose a new maximum value for the difference (UT1-UTC) that will ensure the continuity of UTC for at least a century
I think there are a few possible interpretations of this:
- We'll readjust UTC in a century (why would you do this to yourself, please no, nobody wants this) by setting a predicted maximum that'll last 100 years
- The maximum is now 1 hour, we'll adjust clocks the same way we adjust for DST
- The maximum is infinite, UTC is now TAI + the same integral offset forever
I'm hoping for the last one, but who knows. They've once again kicked the can down the road to the next 2026 meeting to decide what the increase in max UT1/UTC difference will look like.
Our current calendar was introduced in 1582, 440 years ago [0].
We add a leap second about every 1.5 years [1].
That means in the time since our calendar was invented, we've added less than 5 minutes to our time.
Would anyone notice if noon arrived 5 minutes earlier over the course of 500 years? Especially since the position of the sun varies orders of magnitude more than that simply based on season?
Maybe we could all just agree to add 10 minutes in 3022. If we haven't switched calendars again.
[0] https://en.wikipedia.org/wiki/Gregorian_calendar [1] https://www.timeanddate.com/time/leapseconds.html
UT1 - based on Earth's rotation only, strictly 86400 seconds per day; length of each second varies; takes a heck of a lot of effort (and time) to measure accurately.
UTC - has same length of a second as TAI, but (for now) tracks UT1 to a precision of +/- 1 second. To achieve that, can have days that are 86399, or 86400, or 86401 seconds long.
None of the 3 is planned for scrapping. The only change discussed in TFA is to fix the UTC day to 86400 seconds (at the cost of letting it drift further away from UT1).
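For code that needs to relate these scales, TAI is recoverable from UTC via the published table of leap second offsets (TAI-UTC has been 37 s since the start of 2017). A minimal sketch carrying only the last two table entries (a real implementation would carry the full IERS table):

```python
# (effective Unix time, TAI-UTC offset in seconds) -- last two IERS entries
TAI_MINUS_UTC = [
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]

def tai_minus_utc(unix_t):
    """Look up the TAI-UTC offset in effect at a given Unix time.
    Sketch only: times before our truncated table fall back to 35 s,
    the value in effect from 2012-07-01."""
    offset = 35
    for since, off in TAI_MINUS_UTC:
        if unix_t >= since:
            offset = off
    return offset

print(tai_minus_utc(1700000000))  # 37
```

The catch the thread keeps circling: this table cannot be known in advance, which is exactly why pure-TAI storage is attractive.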
As the article said, many countries jump forwards and backwards an entire hour twice a year, and after the adaptation period we don't notice.
We (or our descendants) can try to fix Earth rotation. All is needed is a big enough rotating mass with a variable speed of rotation.
We would use UTC. UT1 would end up only being used by astronomers and whoever else cares about the Earth's rotation.
By the time UTC and UT1 diverge enough to matter, humanity will have either destroyed itself or come up with a new time standard.
Anybody who cares about leap seconds should have just been using TAI all along instead of UTC anyway.
Astronomy commonly uses TAI or raw GPS time. In fact if you look at video footage from NASA control rooms and such there's usually a GPS second clock up on the wall somewhere.
The motivation for leap seconds wasn't astronomers, but just to keep civil time tied to solar time long term. However, the unpredictability of these leap second additions has proven to be pretty annoying, causing bugs and such. This is why Google and others actually "smear" the introduction of their leap seconds over the course of a day.
Considering the current difference is 37 seconds, it's natural to wonder if this is worth it. Certainly most people wouldn't notice relative to dawn and dusk for a very long time - long enough that the entire concept probably wouldn't even make sense anymore. So why not just stop? That's the basic argument here.
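The "smear" mentioned above: Google's published scheme is a linear smear over a 24-hour window centred on the leap second. A toy version of that idea (function and parameter names are mine):

```python
def smeared_offset(t, leap_epoch, smear_window=86400.0):
    """Linear leap smear (a sketch of the technique, not Google's code):
    instead of a 1 s step at leap_epoch, spread the extra second evenly
    over smear_window seconds centred on the leap. Returns the fraction
    of the leap second already applied at Unix time t."""
    start = leap_epoch - smear_window / 2
    if t <= start:
        return 0.0
    if t >= start + smear_window:
        return 1.0
    return (t - start) / smear_window

leap = 1483228800  # end of 2016, the most recent leap second
print(smeared_offset(leap - 86400, leap))  # 0.0 (before the smear window)
print(smeared_offset(leap, leap))          # 0.5 (halfway through)
print(smeared_offset(leap + 86400, leap))  # 1.0 (fully applied)
```

Smeared time never steps or repeats, but during the window every second is slightly long, which is its own trap if clients mix smeared and unsmeared time sources.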
It seems like JPL (NASA) and the scientific community have defined two currently used sets of time standards for observations from Earth and from space near Earth. https://en.wikipedia.org/wiki/Time_standard#Time_standards_f... Barycentric Coordinate Time (TCB) and Geocentric Coordinate Time (TCG) with DE430 as the current revision of their standard https://en.wikipedia.org/wiki/Jet_Propulsion_Laboratory_Deve...
Look at astronomical software, e.g. xephem. There will be a "clocks" tab that displays a number of different clocks: TAI (atomic clock, no leap seconds), UTC (global civil time), mean solar time (GMT, which ISN'T UTC), and finally, derived from the aforementioned, "sidereal time", which is the one you really need to adjust your telescope. Sidereal time is derived from a year with basically one more day, because the earth moving around the sun adds one more rotation of the background stars. Which is a drift of roughly 4 minutes per day.
https://en.wikipedia.org/wiki/Sidereal_time
Oh, and then there is stuff like Julian date which you need to look up the myriads of catalogues and tables you need for corrections because everything "wobbles" even more than you'd think.
Yes, dropping leap seconds will remove 1 table lookup from the above. But astronomical time systems are so complex that that change is a drop in the ocean.
For sophisticated astronomy systems the elimination of leap seconds should simplify and reduce a source of errors. As it is, since Circular T is against UTC you need to take your local source of time and worry if leap seconds have been correctly applied (apply leap seconds to gps time or make sure haven't been fed smeared NTP time or other horrors) before you can get UT1.
For fairly casual astronomy, no -- but that isn't driving international standards, and there mechanical uncertainties dominate, so you'll end up doing a 1-star correction in any case, which immediately corrects the clock.
standardizing leap smearing algos and constants could work
- Bottom layer is atomic clock seconds
- We define a targeted relationship between current UTC and the atomic counter that will occur on a given day and time X
- Time is interpolated to drift UTC into place by the given day and time X
- The standards body can adjust time on some regular basis by its relationship to the atomic clock and publish the algo to convert from atomic to UTC
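Under a scheme like that, converting the atomic counter to UTC is just a linear interpolation between published targets (all names and numbers here are hypothetical, purely to illustrate the shape of the idea):

```python
def atomic_to_utc(atomic_s, prev_target, next_target):
    """Interpolate UTC between two published (atomic_second, utc_second)
    target pairs. UTC drifts smoothly into place by the next target
    instant instead of stepping."""
    (a0, u0), (a1, u1) = prev_target, next_target
    frac = (atomic_s - a0) / (a1 - a0)
    return u0 + frac * (u1 - u0)

# e.g. the standards body publishes that atomic 1000 -> UTC 1000 and
# atomic 2000 -> UTC 1999, absorbing one second over the window
print(atomic_to_utc(1500, (1000, 1000), (2000, 1999)))  # 1499.5
```

This is essentially a standardized smear: monotonic and continuous by construction, at the cost of UTC seconds that aren't exactly SI seconds inside the window.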
Each leap second event causes hundreds of millions of dollars worth of disruption, and that's not including the disruption created by leap seconds even when they're not happening (e.g. the frequent false leap seconds) or the mini-disaster we're sure to experience should there be a negative leap second (which we are still trending towards).
The delays are unfortunate because it's harder to transition applications that need UT1 to use an offset from UTC when the available time sources are still unpredictably and unreliably leaping on you (since to apply a UT1 correction you need your UT1 offset and your UTC source to agree if and how a leap second has been applied).
From a practical perspective it would be better to immediately discontinue leaping, then UTC would immediately become a stable time that adjustments could be applied against for those few applications that need them. It would also save us from a negative leap second.
Yah, one second doesn't matter, but it builds up.
This: "Or we could even decouple our sense of time from the Sun entirely, to create a single world time zone in which different countries see the Sun overhead at different times of day or night."
Shows that they are completely disconnected from human reality: "Science already doesn’t use local times, we talk in UTC."
That's great for science, but people care about day vs night.
I don't think it's that big of a deal; do you really care what the perception of "11 in the morning" was for someone 1,000 years ago? This kind of thing is pretty cultural anyway, and a slow drift over thousands of years doesn't really matter.
The main reason we have the "new" Gregorian calendar is because of religious reasons, not because people were having huge practical problems with the old (slightly less accurate) Julian calendar.
Plus the current leap second system won't really deal with the long-term drift anyway because the earth's rotation keeps slowing, so in a few hundred years we'd need more leap seconds than the current system allows, and eventually we'd need a "leap second" every day because the day is a second longer (around the year 6000 IIRC).
Less than you think. Part of the leap second corrects an unpredictable random walk, and the random-walk part does not add up.
The linear drift part does add up. We're still talking several thousand years to end up with just an hour offset.
Just switching to a new timezone every few thousand years ("As of Jan 1st 6022, fifty years from now, all usage of Eastern timezone will switch to New Eastern timezone, which is an hour ahead.") would handle your civil usage concern fine.
(and then in year 10022 they can switch to New New Eastern timezone, if there are any survivors of world war 5...)
I think this goes along the lines of the great DST debate: the US is planning to do away with the clock changes next year and make DST permanent. Not do away with DST itself, but keep it, forever.
Fundamentally, we create these abstractions, begin to rely on them, and then decouple how we operate from the real world. We created clocks to measure the position of the sun in the sky; now we ignore the sun entirely, and the clocks are god. I don't think that's a good thing generally speaking, but on this particular issue I don't think people will really feel anything different. It will decrease the engineering burden of constantly maintaining and updating systems, which means decreased resource consumption, with no noticeable impact on the day-to-day lives of anyone except those engineers relying on exact time measurement.
This is obviously better. When you save a file, for example, you don't care about the region of the user, so the save timestamp should really be an absolute number. That way, if you take your laptop from one country to another, your whole system doesn't break because it sees timestamps in the future, and the same goes when daylight saving moves the clock back or forward by an hour.
Leap seconds, if we decide we should keep them (I don't think we should, because the difference is so small we wouldn't notice it for centuries, and who knows what computer systems we'll have by then, if humanity still exists), should really be handled at the time-zone level, by shifting an offset that accounts for leap seconds, not by slowing down or speeding up clocks.
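Handled that way, leap seconds become just another offset table consulted at display time, much like a time zone's UTC offset; a sketch with a deliberately abbreviated table (a real one has 27 entries, with TAI−UTC growing from 10 s in 1972 to 37 s in 2017):

```python
# Sketch: treating accumulated leap seconds as a display-time offset,
# the way time zones already shift the underlying clock.
# The table below is abbreviated and illustrative.

LEAP_TABLE = [
    # (Unix time the offset took effect, TAI - UTC in seconds)
    (63072000,   10),  # 1972-01-01
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]

def tai_minus_utc(unix_time: float) -> int:
    """Accumulated leap-second offset in force at a given instant."""
    offset = 0
    for effective, value in LEAP_TABLE:
        if unix_time >= effective:
            offset = value
    return offset

assert tai_minus_utc(1483228800) == 37
```

The underlying clock then never leaps or smears; only the table lookup changes, exactly as when a country redraws a time-zone boundary.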
Today, the solar meridian in Madrid is at 12:59 PM, while the solar meridian in Belgrade is at 11:23 AM. This is because Madrid and Belgrade are within the same time zone, in order to more easily coordinate European commerce.
If that's an acceptable tradeoff, I don't really see the issue in drifting something on the order of dozens-to-hundreds of seconds per century.
There's an exact time everyone knows the correction will happen, it's just a question of how big the correction will be.
The correction is always in the same direction -- no "one second forward, then later one second back" shenanigans.
It always happens over a weekend, so no one has to deal with real-time work-time issues.
While maybe some have to work a weekend, at least it's not near the year-end holidays.
It only happens once every 100 years.
A very interesting idea, but probably much too progressive.
In 1972 it was decided that the best way to compensate for this was to insert (and sometimes remove) leap seconds, so that the difference between UTC and Earth’s rotation is kept below one second. A certain subset of astronomical and navigational/satellite applications rely on that condition being true. If this is changed, some decades-old systems, some of which may be critical infrastructure, may have to be substantially modified to account for leap seconds in some other way. The mention of GLONASS in the article is one such example.
In the international standard body responsible for UTC (ITU-R), there was up to now no sufficient majority in favor of abolishing leap seconds, due to those concerns. (Never change a running system, so to speak.) By now it has become apparent that the benefits of dropping leap seconds should vastly outweigh the potential drawbacks, at least for the next few decades. But it took some time for that realization, and probably also some older participants whose minds couldn’t be changed to die off.
So, I had limited time manipulation capabilities. I tried to write all the date handling stuff by hand (with a lot of "if" statements for the special cases). I recall trying to make leap seconds work, but not sure if I actually did. It worked well enough for my purposes.
Also, no database, so I did it all with flat files. Worked better than you might expect (thanks to built-in flock()), but I wouldn't recommend it.
power    prefix   symbol
10^27    ronna    R
10^−27   ronto    r
10^30    quetta   Q
10^−30   quecto   q
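These four new prefixes are just powers of ten; a small sketch (the function name and dictionary layout are my own, not anything standardized):

```python
# The four SI prefixes adopted at the 27th CGPM (November 2022),
# as name -> (symbol, power-of-ten) entries.
NEW_SI_PREFIXES = {
    "ronna":  ("R", 27),
    "ronto":  ("r", -27),
    "quetta": ("Q", 30),
    "quecto": ("q", -30),
}

def in_prefix(value: float, prefix: str) -> float:
    """Express a plain value in units of the given prefix."""
    symbol, power = NEW_SI_PREFIXES[prefix]
    return value / 10.0 ** power

# The Earth's mass is about 5.97e27 grams, i.e. roughly six ronnagrams.
print(in_prefix(5.97e27, "ronna"))
```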
cf. https://www.bipm.org/documents/20126/64811223/Resolutions-20... (Resolution 3, English version on page 23)
If we decide that we absolutely need to keep our time in sync with the rotation of the Earth, what should really be done is to define a time zone with all the leap seconds applied, and use that time zone only to display time to the end user -- not change the way we sync computer clocks for no reason! NTP shouldn't have to deal with leap seconds, for example...
The cost/benefit at the time didn't look so bad because the world wasn't full of distributed, synchronized computer systems, so the added 'cost' of leap seconds was just some make-work for geeks in national timing labs.
The cost benefit is very different today.
We can keep civil time roughly aligned with the sun by moving timezones an hour every four to five thousand years.
Applications that want to give accurate sidereal time or very accurate sun-up sun-down can use predictions of UT1. Bonus: it's a lot easier to give an accurate UT1 when you don't have to worry that leapsecond (mis)handling has screwed up your underlying clock.
That's another part of the cost model that has changed: when leap seconds were created, it would have been burdensome to carry an additional offset in time transmissions for the few applications that want a more accurate mean solar time. But today it's fairly easy.
Afaik the most vocal opposition to reforms came from the US
I wouldn't be too shocked if they asked "what uses leap seconds", got an answer, and are advocating on that basis ("We have systems that use leap seconds!"), without having actually asked better questions like "What would require expensive changes if no more leap seconds were issued?"
This is surprising to me. Is the Earth's rotation really that unpredictable?
The more precise you want to be, the more numerous and complex the physical phenomena you need to consider become; with a system as involved as the Earth, at some point you get weather-like chaotic behaviour, and no practical amount of additional precision in the input data helps anymore.
The Earth even speeds up and slows down in response to things like earthquakes and volcanoes.
It is really convenient that the date changes whilst we all sleep. Otherwise 'today' and 'tomorrow' get weird.
https://www.bipm.org/documents/20126/64811223/Resolutions-20...
https://www.forbes.com/sites/jamiecartereurope/2022/08/03/do...
Also, why does it say the Earth is slowing down when this year it sped up? That sounds quite impossible.
Edit: TLDR; Local maxima vs. overall trend.
Why does being out of sync with the Earth's orbit matter?
Oh wait, that's not how it works. And neither is it how leap seconds work.
Leap years have to do directly with the sun, what day and time the equinoxes happen every year and the like. This time noticeably drifts every year, even every day, computers or not.
These leap seconds and what not have to do with very sensitive instruments measuring time very precisely. These time measures are entirely about machines.
Does that sound bananas to you?
Of course it is. That's what we're doing with leap seconds. Only some people smear and some people leap, and they don't even do it over the same window or in perfect synchrony.
Just ditch the leap seconds. They are not worth the cost.
Move what is leap-second independent to TAI, the atomic clock basis -- Google, finance, etc. are now happy
Keep UTC etc. as is, since offsets happen to every other human clock all the time anyway, for political, summer-time, or country-boundary changes -- everyone else is happy
And calculating UTC as offsets from the consistent base of TAI sounds like the way to do it
In years to come historians will be having a good old chortle about how we managed to come up with two times that were 37 seconds apart.
In retrospect, fiddling with computer clocks like that was bound to be a nightmare.
https://www.newscientist.com/article/2344401-annual-us-clock....
The great leap-minute crisis of 2147 will be interesting.
The decision was made by representatives from governments worldwide at the General Conference on Weights and Measures (CGPM) outside Paris on 18 November.