> But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time.
It's the other way around. Unix time, for example, is determined by the number of seconds elapsed (as experienced by some point fixed to Earth, roughly) relative to some reference point. Duration is the native format for most time systems, like UTC, because "absolute time" isn't a thing that can be measured.

Faking the duration (e.g. adding leap seconds) within a time system is sort of nonsensical: we only do it to make translation across time systems (UTC, UT1, local/human timezones) simple. If that's the justification for things like leap seconds, then it's better to keep the time system itself in its most native format (assuming that format is useful locally) and do those conversions explicitly only when translating, i.e. at the network boundary: when communicating with a system that doesn't share or understand our time system(s).
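A minimal sketch of what "convert only at the boundary" could look like: internally, time is a plain monotonic count of elapsed seconds, and leap-second bookkeeping is applied only when exporting to (or importing from) an external UTC-like scale. The epoch and the leap-second table below are entirely made up for illustration; a real implementation would use the published leap-second list.

```python
# Internal timestamps: a plain monotonic count of elapsed seconds
# since some epoch ("native format"). No leap seconds inside.

# Hypothetical cumulative offsets: at or after native second N, the
# external UTC-like scale lags the native count by `cumulative` seconds.
LEAP_TABLE = [
    (0, 0),        # epoch: the two scales start aligned
    (1_000, 1),    # pretend a leap second was inserted here
    (5_000, 2),    # and another here
]

def to_external(native_seconds: int) -> int:
    """Translate an internal monotonic count to the external scale,
    subtracting the leap seconds accumulated so far."""
    offset = 0
    for boundary, cumulative in LEAP_TABLE:
        if native_seconds >= boundary:
            offset = cumulative
    return native_seconds - offset

def from_external(external_seconds: int) -> int:
    """Inverse translation at the boundary. (Exactly at a leap second
    this is ambiguous; this sketch picks the later reading.)"""
    for boundary, cumulative in reversed(LEAP_TABLE):
        if external_seconds + cumulative >= boundary:
            return external_seconds + cumulative
    return external_seconds
```

The point is that subtraction of two internal timestamps always yields a true elapsed duration; the leap-second fudging lives only in the translation functions, which run exactly when talking to a system that uses a different scale.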