> it doesn't mean literally seconds since epoch, but instead some unix-style monstrosity
That "unix-style monstrosity" is literally seconds since the UNIX epoch, which is unambiguously defined as starting at 1970-01-01 00:00:00 UTC.
Or it would be, had leap seconds not been forced upon the world in 1972. Since then, arguably, it's no longer "physical earth seconds" since the epoch but "UNIX seconds", where a day is defined as exactly 86400 UNIX seconds.
In retrospect, it'd have been better if a UNIX second had been exactly an SI second, with leap seconds accounted for by the tz database. But the tz database didn't exist until over a decade after the first leap seconds were added, so presumably everybody thought it easier to take the pragmatic option of ignoring leap seconds, exactly as the rest of the world was doing.
I'm still not sure that's what your complaint is about, as I don't know of any time system defined another way that handles this correctly if you ask for the difference in seconds between a time before and a time after a leap second.
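To make that concrete, here's a quick sketch (using Python's `datetime`, which follows the POSIX convention) of what happens across the leap second inserted at the end of 2016:

```python
from datetime import datetime, timezone

# Two instants straddling the leap second 2016-12-31 23:59:60 UTC.
before = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
after = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)

# UNIX-time subtraction reports 1 second elapsed...
print(after.timestamp() - before.timestamp())  # 1.0

# ...but 23:59:60 UTC also occurred in between, so 2 physical (SI)
# seconds actually passed. UNIX time simply cannot represent 23:59:60,
# which is exactly why every UNIX day is 86400 UNIX seconds long.
```

Any system that stores civil time this way gives the same answer; you'd need a TAI-based representation plus a leap-second table to recover the physical 2 seconds.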
Maybe a better question would be: what do you think would be a better way of defining a representation of a date and time, one that allows both easy calculations and easy transformation into how it's presented to users?