Edit: here was the front page of the New York Times at 1600000034,
https://web.archive.org/web/20200913122714/https://www.nytim...
and here's 1500000301 and 1400000634, and 1300007806
https://web.archive.org/web/20170714024501/http://www.nytime...
https://web.archive.org/web/20140513170354/http://www.nytime...
https://web.archive.org/web/20110313091646/http://www.nytime...
My own blog post here commemorating the event: https://susam.net/maze/unix-timestamp-1600000000.html
Given that 100 000 000 seconds is approximately 3 years 2 months, we are going to see an event like this every few years.
I believe the most spectacular event is going to be the Unix timestamp 2 000 000 000, which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only about once every 31 years 8 months!
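For what it's worth, both numbers are easy to check with nothing but the Python standard library:

```python
from datetime import datetime, timezone

# 2,000,000,000 seconds after the epoch:
t = datetime.fromtimestamp(2_000_000_000, tz=timezone.utc)
print(t.isoformat())  # 2033-05-18T03:33:20+00:00

# Gap between consecutive multiples of 10**9 seconds,
# in average (Julian) years of 365.25 days:
years = 1_000_000_000 / (365.25 * 86400)
print(round(years, 2))  # 31.69
```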
By the way, here's 1700000000 on Python:
$ python3 -q
>>> from datetime import datetime
>>> datetime.utcfromtimestamp(1_700_000_000)
datetime.datetime(2023, 11, 14, 22, 13, 20)
>>>
GNU date (Linux):
$ date -ud @1700000000
Tue Nov 14 22:13:20 UTC 2023
BSD date (macOS, FreeBSD, OpenBSD, etc.):
$ date -ur 1700000000
Tue 14 Nov 2023 22:13:20 UTC

Egads! 33 years! I spent my late '90s mudding[0] and for some reason we had a lot of save files named by their epoch timestamp. When I ended up responsible for parts of the code base, I spent a lot of time dealing with those files, and they were all in the 800- or 900-million range. At some point I was pretty much able to tell at a glance roughly what date any number in that range corresponded to, within perhaps a few weeks.
Weird environments foster weird super powers.
"Tonight I'm gonna party like it's (time_t) 1E9"
I might even get to experience 3333333333 if I am lucky. What a day, what a day, yes indeed!
“Boss! We’re being dee dossed!”
“No, son, it’s Tuesday”
I wonder if you can find a shirt that would print that
The system stored timestamps as a string representing seconds since the epoch, but it assumed it would fit in 9 digits. At 1000000000, it started dropping the last digit, so it went back to Sat 1973-03-03 01:46:40 PST, advancing at 10% of real time. It was fixed fairly quickly.
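A minimal sketch of that failure mode (the function name and the 9-digit field are my reconstruction of the description, not the actual system's code):

```python
from datetime import datetime, timezone

def parse_9digit(ts_string):
    # Hypothetical: the field held at most 9 digits, so a
    # 10-digit timestamp silently lost its final digit.
    truncated = int(ts_string[:9])
    return datetime.fromtimestamp(truncated, tz=timezone.utc)

# 1_000_000_000 collapses back to 100_000_000:
print(parse_9digit("1000000000"))  # 1973-03-03 09:46:40+00:00 (01:46:40 PST)

# The truncated clock gains one second per ten real seconds,
# i.e. it advances at 10% of real time:
print(parse_9digit("1000000010"))  # 1973-03-03 09:46:41+00:00
```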
2 days later...
The head of the derivatives tech support team pointed out it was about to hit, so we opened up a shell, ran "watch" on a "date" command outputting epoch seconds, and watched it happen.
Then we went back to work.
[0] "How many seconds are there in a year? If I tell you there are 3.155 x 10^7, you won't even try to remember it. On the other hand, who could forget that, to within half a percent, pi seconds is a nanocentury." --Tom Duff
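Duff's "within half a percent" claim checks out, taking a century as 100 Julian years of 365.25 days:

```python
import math

century_s = 100 * 365.25 * 86400   # a century in seconds
nanocentury = century_s * 1e-9     # about 3.156 seconds
error = abs(nanocentury - math.pi) / math.pi
print(f"{nanocentury:.4f} s, off from pi by {error:.2%}")
```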
But it's really strange to try to map the Emergent political structure onto any modern political axis. It's not "liberal progressive" or "traditional conservative" or "libertarian", or any other popular political ideology. It's certainly authoritarian, but uniquely so. It's almost a dystopia run by project managers and exploiting specialists.
Also a fun bit: the traders in the book base their epoch on the first moon landing, but if you pay attention, the lowest levels of software count from a different epoch.
Assuming I live that long, the next day will be my 65th birthday. Just in time for digital Armageddon.
[1] https://github.com/neomantra/tf
brew tap neomantra/homebrew-tap
brew install tf
Printing out these round ones. `tf` auto-detects at 10 digits, so I started there in the `seq`:

> for TV in $(seq -f %.f 1000000000 100000000 2000000000); do echo $TV $TV | tf -d ; done
2001-09-08 18:46:40 1000000000
2004-11-09 03:33:20 1100000000
2008-01-10 13:20:00 1200000000
2011-03-12 23:06:40 1300000000
2014-05-13 09:53:20 1400000000
2017-07-13 19:40:00 1500000000
2020-09-13 05:26:40 1600000000
2023-11-14 14:13:20 1700000000
2027-01-15 00:00:00 1800000000
2030-03-17 10:46:40 1900000000
2033-05-17 20:33:20 2000000000
Some funny dates. -g detects multiple on a line, -d includes the date:

> echo 1234567890 __ 3141592653 | tf -gd
2009-02-13 15:31:30 __ 2069-07-20 17:37:33
Enjoy... may it save you time figuring out time!

It's a special day, since the next round UNIX day is 30000, at 2052-02-20.
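The day-30000 claim is easy to verify with the Python standard library:

```python
from datetime import datetime, timezone

# UNIX day 30000, i.e. 30000 * 86400 seconds after the epoch:
t = datetime.fromtimestamp(30_000 * 86_400, tz=timezone.utc)
print(t.date())  # 2052-02-20
```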
http://www.df7cb.de/projects/sdate/
one commit message for the QDBs:
From 14df411817feda9decf9dd8a6cd555d71f199730 Mon Sep 17 00:00:00 2001
From: Christoph Berg <myon@debian.org>
Date: Thu, 4 Jun 2020 20:05:49 +0200
Subject: [PATCH] Fix long --covid option
scripts/sdate.in | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
Sep 17 2001 (1000684800) is a special date from git-format-patch. Its significance is lost to time.

@ date -d '@1800000000'
Fri Jan 15 03:00:00 AM EST 2027
# epoch seconds / milliseconds / nanoseconds -> ISO 8601 UTC
tss() { date -Is -u -d @$1 ; }
tsm() { date -Ins -u -d @$( echo "scale=3; $1 / 1000" | bc) | sed -E -e 's/[0-9]{6}\+/\+/' -e 's/,/./' ; }
tsn() { date -Ins -u -d @$( echo "scale=9; $1 / 1000000000" | bc) | sed 's/,/./' ; }
$ tss 1700000000
2023-11-14T22:13:20+00:00
$ tsm 1700000000000
2023-11-14T22:13:20.000+00:00
$ tsn 1700000000000000000
2023-11-14T22:13:20.000000000+00:00

Except for the utterly unwieldy binary, none of those bases adapt well to the bases used in representing time, which are mostly the (partially related) bases 60, 12, and, annoyingly, thirty-ish.
So you always end up doing opaque arithmetic instead of “just looking at the digits” (which you still can do in decimal for century vs years for example, because we defined centuries to be exactly that).
Why?
Was it being used in 1970 and actually started at 0?
Or did they just pick a date to start it and if so what was the initial Unix time when it was first used?
>"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any."
> date -d '@1600000000'
> date -d '@1700000000'
Is there talk anywhere of using a human-readable timestamp instead? e.g. YYYYMMddHHmmssSSSSZ
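For illustration, here's one reading of that layout in Python, assuming SSSS means four fractional-second digits (the helper name to_stamp is made up; this is a sketch, not any standard):

```python
from datetime import datetime, timezone

def to_stamp(dt):
    # YYYYMMddHHmmss, then four fractional-second digits, then a literal Z.
    return dt.strftime("%Y%m%d%H%M%S") + f"{dt.microsecond // 100:04d}" + "Z"

dt = datetime.fromtimestamp(1_700_000_000, tz=timezone.utc)
print(to_stamp(dt))  # 202311142213200000Z
```

Such strings sort lexicographically in chronological order, which is the usual argument for them, but computing differences still requires converting back to a count of seconds.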