My hope for Usenet back in the day was a fully decentralized implementation that would allow each Usenet client to act as a client as well as a mini-server, cutting the middleman out. Freenet (and I2P, I think) was loosely based on this idea at a very high level, but went off in a weird direction (and didn't build on NNTP either).
So if you want Usenet, right now, every $5/month minimal VM that I'm aware of has more than enough CPU, RAM, disk and I/O to support you and two dozen friends, as long as you don't carry binary groups.
How much of that was due to alt.binaries.*?
That's a feat, considering the Pentium II wasn't released until 1997, and the 400 MHz version late 1998.
In the UK, Demon Internet in the 90s maintained a usenet server for customers and one for non-customers (pubnews.demon.co.uk). They carried plenty, had decent retention, and were very well maintained. And busy too. Demon were the largest consumer ISP at the time IIRC. So, at least over here, what you say doesn’t match my recollections.
(I worked at Demon in 1998; if either news server went down I had to fix it if I could, or raise 3rd line expertise otherwise, 24/7)
Gnutella started as this, but for a lot of reasons it didn't scale, with every node sending broadcasts to every other node. The network became a number of federated super-nodes, or ultra-peers, or whatever you'd call them. These would be nodes that had decent uptime, a steady network connection, could handle a number of connections, etc.
These days, with reasonably cheap virtual servers and reduced Usenet traffic, you can just run your own server after finding one or two peers to exchange articles with. The Debian packaging of INN is quite good, I think. It's still a bit of work to set up things, but so is reading and writing articles.
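As a rough sketch of what that peering setup looks like with Debian's INN (the peer hostname and group patterns here are hypothetical): after installing the `inn2` package, you tell INN which peers may feed you articles in `incoming.conf`, and which articles you feed back in `newsfeeds`.

```
## /etc/news/incoming.conf -- who may feed us articles (hypothetical peer)
peer news.example.net {
    hostname: news.example.net
}

## /etc/news/newsfeeds -- what we send back to that peer
## (text groups only; the classic Tf,Wnm file-feed flags)
news.example.net:comp.*,!comp.binaries.*:Tf,Wnm:news.example.net
```

You'd still need to agree with the peer's admin on the group list and arrange the transport, but the configuration surface really is this small for a low-volume text feed.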
All of this applies to the text-only Usenet. A lot of for-pay Usenet was actually about access to binary-only groups with content of questionable copyright status (or even legality, depending on country). I don't know if that's still a thing today. The bandwidth requirements for these binary-only groups could be significant.
Back in the day I remember dreaming up ways to optimize the NNTP server: surely there was a better way than storing each article in a separate file. But actually it was very convenient to have a shell account on the same system that ran the server. This is why I enjoyed TheWorld (Software Tool & Die). You could cat the articles if you wanted.
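For anyone who never saw it, the classic spool layout can be sketched in a few lines (paths and the article are illustrative; real spools also kept history and overview databases alongside the article files):

```python
from pathlib import Path

# One plain-text file per article, with directories mirroring the group
# hierarchy -- roughly how C News/INN laid things out under /usr/spool/news.
spool = Path("spool")
article_dir = spool / "comp" / "lang" / "c"
article_dir.mkdir(parents=True, exist_ok=True)

(article_dir / "12345").write_text(
    "From: alice@example.com\n"
    "Subject: hello\n"
    "Newsgroups: comp.lang.c\n"
    "\n"
    "Hello, world.\n"
)

# With a shell account on the server, reading an article was just `cat`;
# the programmatic equivalent is a plain file read.
print((article_dir / "12345").read_text())
```

No database, no API: every standard Unix tool (grep, cat, ls) worked on the spool directly, which is exactly what made the shell account so convenient.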
/usr/spool/news, C-News, innd (Rich $alz!), it's all coming back..
You try to find the organizations who can catch and index posts from the people you're interested in, at the lowest cost.
It is uncomfortable, but I think the fediverse may need to be more ephemeral than most centralized social networks today. If you shout into the void and nobody hears you...a centralized network will archive your thoughts, but a decentralized one will let them die out as soon as your server does.
It may not be a bad thing. People will re-upload useful content and personally, I'm glad that my 12-year-old self's online musings are lost to the wind.
https://www.wired.com/2001/02/google-buys-deja-archive/
This was integrated into Google search, and it worked great for many years, but then Google slowly chipped away at access to the archive, and now it's all but gone.
I know about the various Usenet archives on archive.org (utzoo (rip), some CD usenet archive), but I want that Deja archive!
https://groups.google.com/g/rec.arts.drwho/search?q=before%3...
Google literally embraced and extinguished it: first it incorporated the message base into web search, but apparently that wasn't enough, so it hid the "Discussions" search option, so that users had to enable it either with some search-string fu or with third-party browser extensions, and then killed it entirely. From that moment, a search that once would return links to people discussing X now mostly returns links to companies selling X. The rest is history.
https://www.seroundtable.com/google-discussion-search-dead-1...
Usenet clients gave their users so much power, it's almost criminal that modern news readers have yet to catch up to them several decades later.
And there's the problem: when you give users power, it's harder to sell their eyeballs to advertisers.
Indeed, and I don't think people realize how great these were. There's a kind of fallacy you often encounter when you bring this up, which to me can be summed up with the following bogus reasoning: "Usenet got killed by the Web, so everything Usenet and Usenet clients did was the wrong way to do things".
Killfiles/scorefiles were better than any system I've used since then. Nothing even comes close.
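The core idea behind killfiles/scorefiles is simple enough to sketch in a few lines (the rule format and names here are hypothetical, not any particular newsreader's syntax): each rule matches a header against a pattern and adds or subtracts points, articles scoring below a kill threshold are hidden, and high scorers can be highlighted.

```python
import re

KILL_THRESHOLD = -100  # articles at or below this never reach the reader

def score(article, rules):
    """article: dict of header name -> value; rules: (header, regex, points)."""
    total = 0
    for header, pattern, points in rules:
        if re.search(pattern, article.get(header, ""), re.IGNORECASE):
            total += points
    return total

rules = [
    ("From", r"spammer@example\.com", -9999),       # classic killfile entry
    ("Subject", r"\bMAKE MONEY FAST\b", -9999),     # kill by subject pattern
    ("From", r"alice@example\.org", 500),           # boost a favourite poster
]

articles = [
    {"From": "spammer@example.com", "Subject": "hi"},
    {"From": "alice@example.org", "Subject": "Re: scorefiles"},
]

visible = [a for a in articles if score(a, rules) > KILL_THRESHOLD]
```

The power came from the rules living on *your* machine, in a plain text file you could edit, share, or version: the server had no say in what you saw.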
Right? It was such a good time :-) You're welcome.
> why usenet couldn't be a viable social media platform
I wonder if identity plays a role here. Centralizing points (likes, retweets, etc) drives people to work on getting attention with their posts, to drive engagement, but also invites troublemakers and controversy.
Not that I'm bitter...
To add some context for those not from that era: DejaNews was the only practical way to search Usenet at all, so it was effectively a Usenet search engine with a crude web client. It was pretty darned crude, but also much easier than the many NNTP clients. Most people I knew used both pretty effectively as Usenet grew.
It was simple and functional, only to be slowly eroded by Google News.
Have no idea where Google News is now. But Dejanews was in many ways a Reddit for that era.
Today? Geez. No comment. Every meaningful relationship I have online is from people I knew and met in real life, some of them a decade or more ago.
That's one of the most important things most people ignore: it means we can have our PERSONAL aggregator instead of relying on someone else's algorithms and censorship. Sure, in Usenet's time (or back at Xerox, where these concepts were implemented for the first time in known history) the scoring and self-filtering techniques were limited, but nowadays that's a CENTRAL point. Usenet is a decentralized network no one owns and no one can really censor as a whole; the aggregators needed for anything high-volume are personal things; posts can be archived locally so they don't disappear; and so on.
Coupled and integrated (as Gnus offers) with RSS feeds and mail, and in Gnus's case also HN (the nnhackernews backend) or Reddit (nnreddit), we can have a CONSISTENT and LOCAL UI for ALL our public information/communication infrastructure, in a robust yet simple manner. That's the classic nuclear-war-resilient internet versus the modern centralized and censored web.
Oh, BTW, Usenet today is almost abandoned, but some have rediscovered it, mostly for piracy, as an alternative to BitTorrent. That's relevant because it means that while binary groups are a bit odd in the modern world, they still perform well enough for that kind of large-scale file sharing.
Or, long story short: by evolving these tools we can have a modern classic desktop that happens to be a PERSONAL human exobrain, work desk, and tool, with the human at the center. With the modern web and its WebVMs, we instead get modern dumb terminals to modern mainframes.
Do you prefer owning nothing "and being happy", like the infamous WEF 2030 video, or do you prefer to own your small slice of the world, a peer among peers?
That's the dream, isn't it.
So if there are 100 people whose messages you willingly watch, how do you discover new people? (Maybe you do so organically, when the people you already follow mention others?)
And if you _do_ allow yourself to taste from the firehose of unfiltered messages, a single personal list of users/messages wouldn't scale and you have the classic spam problem. Do you share your rules with others (and form an aggregation like https://www.dnsbl.info/)?
Curious what you think about https://atproto.com/:
Algorithmic choice
Control how you see the world through an open market of algorithms.

So how do you discover people? Well, on Usenet that happens at a slow but effective pace: you start by choosing some groups you think you're interested in; the total number of groups is not that high, and the names are enough for a simple full-text search over the hierarchy. You start participating, and other members point you to other groups: "you might ask there", "try here", etc. In modern terms this is Reddit subs with a bit more discoverability and fewer abandoned groups. And Reddit today seems effective enough that skilled people add "reddit" to most Google search queries to find better content...
Then the spam problem: in Usenet's Eternal September era, anti-spam was limited; nowadays simple filters suffice, AND we can import another piece of neglected IT evolution: PGP/GnuPG and its chain of trust. Some will keep rotating usernames; others will keep theirs for decades. The latter can share public keys, sign each other's in a classic chain of trust, and exchange spam data automatically from their own clients. Such an approach is limited, but it's still better than today's blocklists, for instance. It's still a cohort of people who decide, BUT that cohort is not a for-profit company.
Long story short: even with current progress, expert systems are FAR from matching human selection quality, and it's actually possible to have partitioned human selection shared spontaneously with others. This is probably slower at discovering new stuff, but it brings higher-quality results and avoids certain drifts, so in the end it's better. Today's volume of posts, often called an "infodemic", needs to slow down and go up in quality. Eternal September didn't show the limits of Usenet but the limits of a certain tech; we can plug in more to surpass those limits and keep evolving, instead of reinventing the wheel with company/startup-style experiments that keep duplicating similar concepts with minor variations.
Competition in the Oracle/CIA style, with different teams pitted against or semi-isolated from each other, produces some results, but history shows that the classic endless evolution of Lisp/Smalltalk systems produces MUCH better evolution, because it doesn't suppress diversity but lets diversity merge and emerge. ALL "recent" IT evolution proves that countless times, and ALL "recent" scientific trends do the same.
Some even gate mailing list mail into a news server like INN so that they can keep using their favorite newsreader. Thankfully, with Gnus, that isn't necessary.
I started to aggressively use newsletters and digests of all sorts, for one. Politics, technology, what to stream. It is SUCH a better experience, and a time saver.
If not, then Twitter, Mastodon, etc. all seem to have a somewhat strong notion of identity.
Some of us would sign Usenet messages with PGP/GPG to deal with this. Lots of users didn't know what to do with the "geek code block" at the bottom of the message.
One day identity verification will occur at the network level. Until then, we have all this half-baked shit.
Yes. Same was true of email back in the day.
But the endless flood of spam was too much, along with the ever-growing risk of "illegal content." It just became way too risky; ISPs started dropping it, and the spiral continued.
No, it seems more likely that it's because they're easier to display on mobile, so mobile first design leads to using them everywhere.
Today's internet demographics and social media landscape, though, probably make Eternal September pale into insignificance.
If you open up a publicly listed HTTP POST API these days, the amount of crap that will be posted to it is legion. You'll have spammers and spammer bots hit it endlessly. You'll have broken scripts pound at it till the end of time. You'll have clever users figure out how to turn it into a data storage API. You'll have every dark thing that hides in the shadows of humanity use you as its new cave. If you allow binary uploads, that API is now a porn site or serving warez. The people pushing the most questionable stuff will come from a vast range of proxied IPs and infected jumpboxes.
It's been some years now since I've been in charge of managing servers that require public facing internet presence and I cannot tell you how glad I am because of it. Attempting to maintain operational integrity on the open internet is like attempting to maintain structural integrity in a blast furnace.
How is that desirable? Why would you want to finish reading something that might already be obsolete?
That "maybe obsolete" state is -your- reading state.
Visiting Usenet today though will only lead to sadness, unless you're looking for something you shouldn't be or enjoy endless spam.
They were much more powerful than any other forum interface I've seen since. Though the very compact thread views like in `old.reddit.com` and HN are nice.