story
https://en.wikipedia.org/wiki/ISO_8601
Handed down by the ISO, The Great Compromise allows YYYY-MM-DD (or YYYYMMDD if you're in a hurry), but I'd find the version with slashes ambiguous and upsetting, especially early in the month.
The standard is good, and you can get it from `date -I`. Hell mend anyone who messes with the delimiters or writes the year in octal or any other heresy.
(Example: you want to know if a person is old enough to buy cigarettes, and you need to store a birthday that you can compare against the current day to see if they're legally 18. If you store an epoch at UTC, do you store the time of day they were born? That's not the way the law works. Do you store midnight UTC? If they're currently in NY, can they buy cigarettes at 7pm the day before their birthday because they're currently 18 in London?)
Sometimes you need a logical calendar date, not a point in time in the history of the universe.
If you were born in the US, can you buy cigarettes at 12:00 am on your 18th birthday in London?
I’ve never heard of age verification laws caring what timezone you were born in. In fact, you couldn’t even pinpoint this from many people’s ID cards. Plenty of US states span multiple time zones, and I wouldn’t be that surprised if there were a maternity ward out there sitting on a TZ line.
Certainly if you want to store birth dates and do age verification there is no point bothering with these issues: just store the calendar date. It's then trivial to derive age for age-limit purposes.
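A minimal sketch of that idea (the `{y, m, d}` shape and the `ageOn` name are made up for illustration): age arithmetic on plain calendar fields, with no epochs or timezones involved.

```javascript
// Age on a given calendar date, computed purely from {year, month, day}.
// No Date objects, no UTC, no offsets: exactly what ID-based age checks do.
function ageOn(birth, today) {
  let age = today.y - birth.y;
  // Haven't reached the birthday yet this year? Subtract one.
  if (today.m < birth.m || (today.m === birth.m && today.d < birth.d)) {
    age--;
  }
  return age;
}

console.log(ageOn({ y: 2007, m: 5, d: 28 }, { y: 2025, m: 5, d: 28 })); // 18
```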
ISO 8601 allows for 2 or 6 digit years. Truncating to 2 is just incorrect, and 6 digits is absurd. And you can read the RFC without paying ISO - and you can discuss the RFC with people who have also read it instead of relying on people using the Wikipedia page to interpret and explain ISO 8601.
I have a scheduling service at work and I keep getting requests for implementing ISO 8601 timestamps but I ignore them. RFC3339 is the way forward.
Just found a random link to it with an image search:
https://gyazo.com/d8517f72e24c38f055e17182842b991c/max_size/...
ISO 8601 does have some strange formats...
> ISO 8601:2000 allowed truncation (by agreement), where leading components of a date or time are omitted. Notably, this allowed two-digit years to be used as well as the ambiguous formats YY-MM-DD and YYMMDD. This provision was removed in ISO 8601:2004.
(That's from https://en.wikipedia.org/wiki/ISO_8601 - I don't have the standards handy, ironically.)
Honestly I'm happy with either the RFC or ISO, but it seems like most normies haven't heard of RFCs so ISO is my default.
Totally insufficient for capturing important future events like the death of the sun.
Conversion to UTC is not injective, e.g. when clocks change or politics happen.
UTC only loses information.
The problem is that the local IANA time zone can be difficult to determine portably. One wishes POSIX had an API, "give me the IANA time zone name for the current process", which would do the needful to work it out (read the TZ environment variable, readlink /etc/localtime, whatever else might be necessary)… but no, you are left to do those steps yourself. It works reasonably well if the TZ environment variable is set, but it most commonly isn't; readlink of /etc/localtime works on macOS and some Linux distros… but others make /etc/localtime a regular file rather than a symlink, which makes it all a lot harder.
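In JavaScript, at least, the runtime does this dance for you: the standard Intl API exposes the resolved IANA zone name (the exact string you get back depends on the host environment).

```javascript
// The JS engine resolves the host's time zone to an IANA name internally,
// hiding the TZ-variable / /etc/localtime / registry lookup from you.
const tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
console.log(tz); // e.g. "Europe/London" (host-dependent)
```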
And that’s POSIX. Then there’s Windows, which is possibly the last platform to still use its own timezone database instead of IANA’s. Now, Unicode CLDR maintains a Windows-to-IANA mapping table… but you have to ship both that table, and maybe the IANA timezone DB too, with your app, and keep them updated.
I really wish Microsoft would ship the IANA database with Windows, along with the IANA-Windows mapping table, provide APIs to query them, and keep them updated via Windows Update. The core Windows OS and existing Windows apps can keep using the legacy Windows TZ database for backward compatibility, whereas portable apps could use IANA instead if they wish.
Say, for a somewhat annoying-case example, you want to store a meeting date/time that's in the future, in January, in the city of New York, New York, USA, with remote participants elsewhere in New York state. Sometime between when the calendar invite is created and the meeting itself, the city of New York decides to change to permanent Daylight Saving Time, but the rest of New York state doesn't. If you stored only the UTC time and "America/New_York" you now have an ambiguous meeting date/time, since the "America/New_York" time zone split and the city of NY is an hour off from the rest of NY for part of the year, and your remote participants could get the wrong time.
There's probably an even worse case involving the death of a Japanese emperor, since the "period" portion of a Japanese date is the imperial name of the emperor who ruled at that time, and that gets retroactively applied to dates between the new emperor's coronation and when they took their new imperial name.
My point is that this is an extremely niche case and works around one particular type of timezone insanity. You either have a team dedicated to dealing with timezone insanity, or you store stuff in UTC.
Either use dedicated "from_iso8601" functions, or manually specify the format of the input string ("%Y%m%dT%H%M%SZ")
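A sketch of the "manually specify the format" option in JS, which has no strptime: a hypothetical `parseBasicUtc` helper pins the "%Y%m%dT%H%M%SZ" layout down with a regex instead of trusting Date's format-dependent parsing.

```javascript
// Parse a compact UTC timestamp like "20250528T131500Z" explicitly,
// rejecting anything that doesn't match the expected layout.
function parseBasicUtc(s) {
  const m = /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/.exec(s);
  if (!m) throw new Error("unexpected format: " + s);
  const [, y, mo, d, h, mi, sec] = m.map(Number);
  // Date.UTC takes a zero-indexed month, hence mo - 1.
  return new Date(Date.UTC(y, mo - 1, d, h, mi, sec));
}

console.log(parseBasicUtc("20250528T131500Z").toISOString());
// 2025-05-28T13:15:00.000Z
```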
But then discussion ensues about how programmers these days add libraries as dependencies for almost everything! :-)
I guess at some point a middle ground must be found.
> never ever use anything but ISO dates in UTC tz unless you're displaying it for a user in a UI.
Not good for storing future meeting times. DST switchover dates can change, and your tz-normalized date won't change with it. But then I guess we might need to account for fractured societies and actually store some kind of organizational code for which belief system the event author adheres to? :-)
Internally everything is stored and handled in TAI (better than UTC as no discontinuity) and translated from/to something else for human consumption.
I.e. for instance you should have logic to figure out what TAI period corresponds to "next month's meetings" if that's what the user wants, which you apply immediately on user inputs; then you can forget about DST, time zones, etc. in the rest of the code and storage.
Another benefit is that if your user was in New York but is now in London it is trivial and well-constrained to adjust to local time.
An engineer in the US reviewing industrial measurements logged in a plant in Asia from a variety of sources is definitely going to encounter lots of events recorded in local time. It would be maddening for that engineer to have to review and resolve events from different time coordinates, especially if they are doing the review months or years later. It's best to accept that reality and adopt local time as the standard. Then you must record the TZ offset per UTC in any new system you create.
I use ISO for everything and your software wrongly assuming I want a deranged lunatic date format based on some locale is not going to cut it.
Locale is ok as a first guess, but maybe allow users to make that choice?
Special shoutout to the author of node-postgres saying that PG's date type is better not used for dates in this case.[1] I love programming.
[1] https://node-postgres.com/features/types#date--timestamp--ti...
Is it sane? Is midnight at the start of a day, or the end of it? I'd think noon would be less ambiguous, and significantly less prone to these timezone issues (although this may not be a benefit).
Midnight at the start of the day: 00:00:00
Midnight at the end of the day: 24:00:00
Because the other languages with bigger runtimes and more comprehensive standard libraries (Java applets, Microsoft Silverlight, Macromedia Flash) that were promoted for browsers to create "rich fat clients" were ultimately rejected for various reasons. The plugins had performance problems, security problems, browser crashes, etc.
Java applets were positioned by Sun & Netscape to be the "serious professional" language. Javascript was intended to be the "toy" language.
In 1999, Microsoft added XMLHttpRequest() to IE's Javascript engine to enable Outlook-for-web email that acted dynamically like Outlook-on-desktop without page refreshes. Other browsers copied that. (We described the early "web apps" with jargon such as "DHTML" DynamicHTML and "AJAX".) In 2004, Google further proved out Javascript capabilities for "rich interactive clients" with Gmail and Google Maps: smoothly dragging map tiles around and zooming in and out without Macromedia Flash. Even without any deliberate coordinated agenda, the industry collectively began to turn Javascript from a toy language into the dominant language for all serious web apps. Javascript now had huge momentum. A language runtime built into the browser without a standard library, like Javascript, was prioritized by the industry over options like plugins with a bigger "batteries included" library. This overwhelming industry preference for Javascript happened before Node.js for server-side apps in 2009 and before Steve Jobs supposedly killed Flash in 2010.
The situation today of Node.js devs using npm to download "leftpad()" and a hundred other dependencies to "fill in the gaps" of basic functionality comes from the history of Javascript's adoption.
Java's Date standard lib was awful for 2 decades, so there's no guarantee that a big standard library is a good standard library.
JS was a compromise. It had to be sent out the door quick, it needed to look sufficiently like Java to not upset Sun who were trying to establish Java as the universal platform at the time while not being feature complete enough to be perceived as a competitor rather than a supplement. And it had to be standardized ASAP to pre-empt Microsoft's Embrace Extend Extinguish strategy (which was well on its way with JScript). That's also why it's an ECMA standard rather than ISO despite Netscape not having been based in Switzerland - ECMA simply offered the shortest timeline to publishing a standard.
I think what's more amazing isn't just how we managed to build the bulk of user interfaces in JavaScript but how Node.js managed to succeed with ECMAScript 3. Node.js was born into a world without strict mode and without even built-in support for JSON: https://codelucky.com/javascript-es5/ - and yeah, ECMAScript 3 was succeeded by ECMAScript 5, not 4, because it took vendors 10 years to agree on how the language should evolve in the 21st century. Not only did we build the modern web on JavaScript, we built a lot of it on the version of JavaScript as it was in 1999! Even AJAX wasn't standardized until 2006, when Web 2.0 was already in full swing.
That is not a better world.
Also, this could've been handled easily by a committee (ugh), or by a 3rd-party open source organization akin to the Linux Foundation that just makes a JS standard library all browser vendors use. Or by just making a specification for how JS handles these things.
you know - like a lot of other languages are handled, including their standard library.
I mean, even Microsoft gave up and just went with Chromium, and they have just about the definition of infinite resources at their disposal.
Effectively if your website doesn't run in Chrome and Safari, it won't be seen by 99% of the market.
Ah yes, Microsoft, the defenders of the free world.
Not a better world, just the current world.
[Ok that's enough, ed.]
What I'm looking for is "there has to be a library function for that; I would look it up".
JavaScript, even with all its thorns, is a very consistent ecosystem nowadays, across browsers and architectures. Being forever backward compatible is a good thing: most code that ran during the 2000s can still run with minimal changes.
By that, I don't mean to dismiss the importance of backward compatibility, but this case is particularly funny because:
1. It had already been changed multiple times, each a breaking change, so it’s not like this form of compatibility was ever seriously respected;
2. Having it behave differently from other "legacy forms," like the slash-separated version, is itself arguably a break in backward compatibility;
3. As noted in the article, it never worked the same between Chrome and Firefox (at this point) anyway, so it’s doubtful how impactful this "breaking change" really was, considering you already had to write shim code either way.
console.log(new Date('2025/05/28').toDateString());
console.log(new Date('2025-05-28').toDateString());
console.log(new Date('2025-5-28').toDateString());
Output below:
Wed May 28 2025
Wed May 28 2025
Wed May 28 2025
We Swedes use standardized ISO 8601 dates such as YYYY-MM-DD, as dictated by our excellent government, and you find them in use in our social security numbers, government correspondence and mostly everywhere.
Same here in Germany! ...Which is the reason why everyone ignores it in favour of the traditional format.
I love democracy, and also mountain-shaped temporal unit ordering ^ It's 28.05.2025 13:15.
Text-ordering by date is a nightmare because everything is first grouped by day-of-month, then month, then year! :)
Falsehoods programmers believe about time gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b923ca
- Understand the semantic difference between a timestamp (absolute time) and clock/calendar time (relative time). Understand which one your use case uses. Don't use one to store the other.
- If the use case calls for a relative time, do not manually construct or edit the date. Use your platforms date-creation/modification APIs, no matter how unnecessary they seem.
- Understand what is inside your platform's date types at rest. Understand which of your platform's date APIs pull in environmental information (time/tz/locale), as opposed to only using the arguments you pass it. Understand that your platform's 'print/stringify' function may be one of those aforementioned functions. Misunderstanding this often leads people to say inaccurate things. E.g. say your platform has a Date object that stores an epoch-based timestamp. People may say "the Date object is always in UTC", when really the Date object has no time offset, which is not the same thing.
- Understand that if you pass a date around platforms, it might accidentally be reserialized into the same absolute time, but a different relative time.
- Understand that there is a hierarchy of use cases, where each one has more complex requirements:
1. "Create/modify" timestamps; egg timers. (absolute time)
2. Alarm clocks (same clock time always).
3. One-time calendar events (has an explicit, static tz; same clock time if the user changes its day or time zone; different clock time if the user's time offset changes)
4. Recurring calendar events (same as above, except don't change the clock time if the user's time offset changed due to DST, as opposed to a geographic change)
5. Recurring calendar event with multiple participants (same as above, just remember that the attached tz is based on the creator, so the clock time will shift during DST for participants in a place without matching DST rules).
Note that a lot of platforms nowadays have built-in or 3rd party packages that automatically handle a lot of the rules in the above use cases.
Finally, understand that all those little weird things about dates (weird time zones, weird formatting conventions, legislative time zone changes, retroactive legislative time zone changes, leap days, leap seconds, times that don't exist), are good to know, but they will mostly be accounted for by the above understandings. You can get into them when you want to handle the real edge cases.
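The "what is inside your platform's date types at rest" point above, sketched with the JS Date type: the object stores only an epoch offset, and it's the formatting call that pulls in the environment's time zone at print time.

```javascript
// A Date holds only an epoch offset; there is no time offset stored inside.
// Formatting functions consult the environment's time zone when they run.
const d = new Date(Date.UTC(2025, 4, 28)); // months are zero-indexed: 4 = May
console.log(d.getTime());     // plain epoch milliseconds, timezone-free
console.log(d.toISOString()); // environment-independent rendering
console.log(d.toString());    // environment-dependent rendering
```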
Being available everywhere (as far as browsers are concerned) trumps almost all other factors.
The real "bug" in the example is 2025/05/28 being May 28th because the implementation ignores timezones for that format.
The issue with `Date` is that it is based on the original `java.util.Date` class and inherits all of its problems: https://docs.oracle.com/en/java/javase/21/docs/api/java.base... - this is also where the wonky zero-indexed month value comes from. Note that Java deprecated all versions of the constructor other than the one taking a millisecond value or nothing, which JS can't do for backwards compatibility reasons.
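The inherited zero-indexed month in action:

```javascript
// The zero-indexed month inherited from java.util.Date: 11 means December.
const dec25 = new Date(2025, 11, 25); // local-time December 25th, 2025
console.log(dec25.getMonth()); // 11, not 12
```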
Hopefully `Temporal` will solve these problems but how long that spec has been in the works should tell you how difficult this is to get right when anything you put on the web is forever.
Unless a timezone/offset is given, it's considered a plain date/time, not an epoch.
- `Temporal.Instant` is a timestamp, the equivalent of a millisecond epoch timestamp or a UTC ISO string.
- `Temporal.PlainDateTime` (and `Temporal.PlainTime`, `Temporal.PlainDate`, `Temporal.PlainYearMonth` and `Temporal.PlainMonthDay`) is timezone-unaware (i.e. they don't represent an actual point in time unless qualified with a timezone, e.g. when you just want to reference a calendar date).
- `Temporal.ZonedDateTime` is timezone-aware (i.e. what you usually mean when dealing with specific instants in time).
There's also `Temporal.Duration` which represents a delta between two instants and can be used to perform date manipulation (e.g. adding exactly one day or year to a given instant).
I guess in terms of Postgres, `Temporal.ZonedDateTime` and `Temporal.PlainDateTime` are similar to `timestamptz` and `timestamp`, `Temporal.PlainDate` and `Temporal.PlainTime` are similar to `date` and `time` (without a timezone), and `Temporal.Duration` is similar to `interval`. It's odd that Postgres allows `time` to have a timezone, but I guess that's only meant to be used when storing an instant as a pair of `time` and `date` rather than a single `timestamptz`.
`Temporal.PlainYearMonth` and `Temporal.PlainMonthDay` seem like the odd ones out but they make as much sense as `Temporal.PlainDate` if you want to reference specific calendar dates/months with incomplete information. I guess for sake of completeness one could argue for a need to be able to specify a precision for `Temporal.PlainTime` to distinguish between a reference to "8 am", exactly "8:00", exactly "8:00.0" and so on, but that use case seems a lot more niche than those solved by the additional date types.