However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
People who are willing to drop $20k on a computer might not be affected much tho.
They probably won't, but those willing to drop $3-10k will be if consumer and data-center computing diverge at the architectural level. It's the classic hollowing out of the middle - most of the offerings end up in a race to the bottom chasing a volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high end becomes increasingly bespoke/pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).
This is what I'm afraid of. As more stuff moves to the cloud helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.
I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.
Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.
Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.
It's also a nightmare from any sort of privacy perspective, in a world that's already becoming too much like a panopticon.
https://us.ugreen.com/collections/usb-c-hubs - these docks only require a single USB port to connect to. That could be an SBC working as a handheld. These docks could end up being the largest cost component in the new era of all-in-ones. UGreen could be the next Apple as screens and processors snap on to these hubs, in addition to their own range of power banks and SSD enclosures. Their quality is high too.
In fact, I would go so far as to say we are entering a tinkering culture, and free-energy technologies are upon us as a response to oppressive economic times. Sort of like how the largest leaps in religious and esoteric thought have occurred in the most oppressive of circumstances.
People will reject their crappy thin clients, start tinkering and build their own networks. Knowledge and currency will stay private and concentrated - at least at first.
This will likely extend further and further, more into the "normie" territory. MS Windows is, of course, the thing that keeps many people pinned to the x64 realm, but, as Chromebooks and the Steam Deck show us, Windows is not always a hard requirement to reach a large enough market segment.
This is what always happens in capitalism: scarcity is almost always followed by glut.
Memory makers, for example, have sold out their inventory for several years, but instead of investing to manufacture more, they’re shutting down their consumer divisions. They’re just transferring their consumer supply to their B2B (read AI) supply instead.
That's likely because they don't expect this demand to last past a few years.
My phone has 16gigs of ram and a terabyte of storage, laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
Running Electron apps and browsing React-based websites, of course.
I wonder if there’s a computer science law about this. This could be my chance!
The constant increases in website and electron app weight don't feel great either.
Most affordable laptops have exactly that, 16gigs of ram and a terabyte of storage. Think about THAT!
For word processing, basic image manipulation, and Electron apps (well...) even the "cheap" Macbook Neo is good enough, and it's a last year phone CPU. But that's not enough for a lot of use cases.
That's "non powerful" to you?
This absolutely boggles my mind. Do you mind if I ask what type of computing you do in order to justify this purchase to yourself?
I'm also into motorcycles. Before I owned a house with a garage, I had to continuously pack my tools up and unpack them the next day. A bigger project meant schlepping parts in and out of the house. I had to keep track of the weather to work on my bikes.
Then, when I got a house, I made sure to get one with a garage and power. It transformed my experience. I was able to leave projects in situ until I had time. I had a place to put all my tools.
The workstation is a lot like that. The alternative would be renting. But then I'd spend a lot of my time schlepping data back and forth, investing in setting things up and tearing them down.
YMMV. I wouldn't dream of trying to universalize my experience.
I would bet it continues to be more affordable to buy reasonable specs with current consumer hardware, rather than buying a top system once.
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.
You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.
Apple hardware is incredibly overpriced.
It wasn't my primary motivator but it hasn't made me regret my decision.
I hummed and hawed on it for a good few months myself.
How is this going to work? You need uncontrolled compute for developing software. Any country locking up that ability too much will lose to those who don't.
I don't know your workloads, but for me personally 64 GB is the ceiling on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use-cases and I'm always going to shell out for the API/frontier capabilities. I'm even thinking of the 48 GB config because they already have those at 8% discounts, shipped by Amazon, and I never hit that even on my workstation with 64 GB.
No, it won't. The power drain of merely refreshing DRAM is negligible, it's no higher than the drain you'd see in S3 standby over the same time period.
As someone who just bought a completely maxed out 14" Macbook Pro with an M5 Max and 128GB of RAM and 8TB SSD, it was not $10k, it was only a bit over $7k. Where is this extra $3k going?
Main-frame (thin) -> PC (fat) -> Internet/Cloud (thin) -> Mobile (fat) -> AI (thin)
I expect this to continue until the next technology transition.
In each of these shifts, and there have been others, things are not completely fat or thin, more of an in-between state but leaning to local vs cloud.
This is where I think current hackers should be headed. I grew up with lots of family who were backyard mechanics, wrenching on cars and motorcycles. Their investment in tools made my occasional PC purchase look extremely affordable. Based on what I read, senior mechanics often have five-figure US dollar investments in tools. Of course, I guess high quality torque wrenches probably outlast current GPU chips? I'd hate to be stuck making a $10K investment every 24 months on a new GPU . . .
I have been renting GPU resources and running open weight models, but recently my preferred provider simply doesn't have hardware available. I'm now kicking myself a little for not simply making a big purchase last fall when prices were better.
I've replaced transmissions, head gaskets, and done all work for our family cars for two decades based on a Costco toolkit, and 20 trips to the autoparts store or Walmart when I needed something to help out.
Maybe I'm being a little forgetful that yes, I bought a jack, and jack stands, and have a random pipe as a breaker bar, and other odds and ends. But you can go very far for $1k as a DIYer.
How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?
Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are being sold in high numbers for little desktop systems.
If you haven't tried out a desktop CPU in a while, I highly recommend giving it a try if you're used to only using laptops; even within the same class the difference is obvious.
For CPU-bound tasks like compiling they’re not that different. For GPU tasks my desktop wins by far but it also consumes many times more power to run the giant GPU.
If you think laptops are behind consumer desktops for normal tasks like compiling code you probably haven’t used a recent MacBook Pro.
A 300W GPU released in 2025 is about 10x M5 perf. The difference is going to be smaller for CPU perf, but also not close.
This is not true. The recent MacBook Pros are every bit as fast as my Zen 5 desktop for most tasks like compiling.
For GPU there is a difference because both are constrained by thermal and power requirements where the desktop has a big advantage.
For CPU compute, the laptop can actually be faster for single threaded work and comparable for multi threaded work.
Anyone claiming laptop CPUs can’t keep up with desktop CPUs hasn’t been paying attention. The latest laptops are amazing.
Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I made in the last five years?
Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.
The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.
That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.
I don't mean to judge, it's your money but to me it seems like an enormous waste. Just like spending $100k on a car when you can get one for $15k that does pretty much exactly the same job.
That’s for everyone
Never really used it all, usually only about 40%, but it's one of those "better to have it and not need it" things, and better than selling and re-buying a larger memory machine (when it's something you can't upgrade, like a Mac or certain other laptops).
It really feels like we're slowly marching back to the era of mainframe computers and dumb terminals. Maybe the democratization of hardware was a temporary aberration.
We live in a world where we optimised for globalization: industry in China, oil in the Middle East, etc.
This approach proved to be fragile in the hands of people with enough money and/or power to tilt the scale.
Tech feels increasingly fragile with more and more consolidation. We have a huge chunk of advanced chip manufacturing situated on a tiny island off the coast of a rising superpower that hates that island being independent. Fabs in general are so expensive that you need a huge market to justify building one. That market is there, for now. But it doesn't seem like there's much redundancy. If there's an economic shock, like, I dunno, 20% of the world's oil supply suddenly being blockaded, I worry that could tip things into a death spiral instead.
I thought the trend is in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based such as RTX 6000 Pro+). Just less VRAM and fewer tensor cores, artificially.
Where is the divergence happening? Or you don't view RTX 5x as consumer hardware?
Boy, am I glad I decided to get the whole 128GB before RAM prices spiked!
And I fear they will be equally confused and annoyed by disposing of all of them.
Are you kidding? Apple's mobile chips are now delivering perf that AMD & Intel desktop chips never could or did.
Most applications don't make aggressive use of the SIMD instructions that modern x86 chips offer, thus you get this impression. :-(
I don't share the same 1:1 opinion with regards to the article, but it is absolutely clear that RAM prices have gone up enormously. Just compare them. That is fact.
It may be cheaper later on, but ... when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something because the prices are now much higher than before. Add to this the oil crisis Trump started and we are now suddenly having to pay more just because a few mafiosi benefit from this. (See Krugman's analysis of the recent stock market flow of money/stocks.)
What are you talking about?
My laptops are, and always have been, primarily places where I do local computing. I write code there, I watch movies there, I listen to music there, I play games there...all with local storage, local compute, and local control (though I do also store a bunch of my movies on a personal media server, housed in my TV stand, because it can hold a lot more). My smartphone is similar.
If you think that the vast majority of the work most people do on their personal computers is moving to LLMs, or cloud gaming, then I think you are operating in a pretty serious bubble. 99.9% of all work that most people do is still best done locally: word processing, spreadsheets, email, writing code, etc. Even in the cases where the application is hosted online (like Google Docs/Sheets), the compute is still primarily local.
The closest to what you're describing that I think makes any sense is the proliferation of streaming media—but again, while they store the vast libraries of content for us, the decoding is done locally, after the content has reached our devices.
It doesn't matter if a cutting-edge AI-optimized server can perform 10, 100, or 1000 times better than my laptop at any particular task: if the speed at which my laptop performs it is faster than I, as a human, can keep up (whatever that means for the particular task), then there's no reason not to do the task locally.
People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.
Open source efforts need to give up on local AI and embrace cloud compute.
We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.
When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.
If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.
If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.
Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.
An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.
That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.
We have this class of models already, Kimi 2.5 and GLM-5 are proper SOTA models. Nemotron might also release a larger-sized model at some time in the future. With the new NVMe-based offload being worked on as of late you can even experiment with these models on your own hardware, but of course there's plenty of cheap third-party inference platforms for these too.
Oh god no, please not more slop, you're already consuming over 1 percent of human energy output, could you, like, chill a bit?
I.e., /if/ I am going to consume LLM tokens, I figure that a local LLM with 10s of billions of parameters running on commodity hardware at home will still consume far more energy per token than that of a frontier model running on commercial hardware which is very strongly incentivized to be as efficient as possible. Do the math; it isn't even close. (Maybe it'd be closer in your local winter, where your compute heat could offset your heating requirements. But that gets harder to quantify.)
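A rough sketch of that math in Python - every number here is invented purely for illustration; the real point is that a datacenter accelerator amortizes its power draw across a large batch of concurrent requests, while a home GPU serves one:

    # Back-of-envelope energy per generated token.
    # Every figure below is an assumption for illustration, not a measurement.
    local_power_w = 350            # assumed wall draw of a home GPU + host under load
    local_tokens_per_s = 15        # assumed decode speed for a tens-of-billions model

    dc_power_w = 700               # assumed draw of one datacenter accelerator
    dc_concurrent_requests = 64    # assumed batch size served per accelerator
    dc_tokens_per_s_per_req = 40   # assumed per-request decode speed

    local_j_per_token = local_power_w / local_tokens_per_s
    dc_j_per_token = dc_power_w / (dc_concurrent_requests * dc_tokens_per_s_per_req)

    print(f"local:    ~{local_j_per_token:.0f} J/token")   # ~23 J/token
    print(f"frontier: ~{dc_j_per_token:.2f} J/token")      # ~0.27 J/token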
Maybe it's different if you have insane and modern local hardware, but at least in my situation that is not the case.
- Our career is reaching the end of the line
- 99.9999% of users will be using the cloud
- if we don't have strong open source models, we're going to be locked into hyperscaler APIs for life
- piddly little home GPUs don't do squat against this
Why are you building for hobby uses?
Build for freedom of the ability to make and scale businesses. To remain competitive. To have options in the future independent of hyperscalers.
We're going to be locked out of the game soon.
Everyone should be panicking about losing the ability to participate.
Play with your RTXes all you like. They might as well be raspberry pis. They're toys.
Our future depends on our ability to run and access large scale, competitive, open weights. Not stuff you run with LM Studio or ComfyUI as a hobby.
Here's my retort: https://news.ycombinator.com/item?id=47543367
Personally I think it will be a big headache for HP, people can be hard on laptops and HP is already not excited about consumer support (i.e. mandatory 15 minute wait time for support calls). But if they make it work, I think there's probably a good number of people who feel like they need a laptop but don't care so much about the specifics and want to keep their costs low (as all of their costs appear to be rising right now).
For consumers who don't replace their laptops on a schedule it makes less sense.
Competition.
RAM was this price some years back, and yet last summer/fall it was at an all-time low.
Will we continue to see steady improvement in top quality CPU/GPUs? Would they even bother releasing consumer versions of ram faster than DDR5?
The current generation is insanely fast. I am planning to get a gaming PC for my wife and a mix of gaming + workstation PC for me (or maybe just base it off of the Ryzen 9950x3D and call it a day). We plan to hold on to them for 10 years.
I don't care if anything 6x faster comes out. For what I need the current generation is even an overkill.
I'd even go as far as to say that it would be quite OK if that's the very last generation and no further hardware development ever happens.
Happened before, will happen again.
(A large factor here is, obviously, the cloud. With photos, documents, e-mail, IMs, etc. all hosted for cheap or free on "other people's computers", the total hardware demands on the end-user computing device is much less. Think storage, not just RAM.)
It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once. I have a separate company laptop for work, and I occasionally turned on my PC, but it turns out that a foldable phone is good enough to do everything on personal side I'd normally use a laptop for. So here I am, with my primary compute device I don't have full control over - and yes, I'm surprised by this development myself, and haven't fully processed it yet.
It's a deeply flawed comparison, because many of the things we do with a phone now weren't something we'd do at all with the computers we grew up with. We didn't pay at the grocery store with a computer, we didn't buy metro tickets, we didn't use it to navigate (well, there was a short period of time where we might print out maps, but anyway..)
When I grew up, I feel like our use of home computers fell into two categories:
1. Some of us kids used them to play games. Though many more would have a Nintendo/Sega for that, and I feel like the iPhone/iPad is a continuation of that. The "it just works" experience where you have limited control over the device.
2. Some parents would use it for work/spreadsheets/documents ... and that's still where most people use a "real" computer today. So nothing has really changed there.
There is now a lot more work where you do the work on services running on a server or in the cloud. But that's back to the original point: that's in many cases just not something we could do with old home computers. Like, my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before, and arguably isn't possible without a server/cloud-based infrastructure.
Phones/tablets as an interface to these services is arguably a continuation of like those old dumb terminals to e.g. AS/400 machines and such.
> It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once.
I do agree, I am in a similar situation.
There is a reason I have a server in my basement - it lets me edit files on my phone (if I must - the keyboard and screen space are terrible compromises, but sometimes I can live with it), laptop (acceptable keyboard and screen), or desktop (great keyboard, large screen); it also lets me share with my wife (I haven't got this working but it can be done). I have nearly always had a server in my house because sharing files between computers is so much better than only being able to work on one (or using floppies). The cloud expands my home server to anywhere in the world: it offloads security onto someone else, and makes it someone else's problem to keep the software updated.
There is a lot to hate about the cloud. My home servers also have annoyances. However for most things it is conceptually better and we just need the cloud providers to fix the annoyances (it is an open question if they will)
The iPad was the perfect device for her (I've touched one perhaps twice, in my entire lifetime).
Meanwhile my much more expensive laptop mostly interfaces with applications that primarily exist on servers that I have no control over, and it would be nearly worthless if I disconnected it from the Internet. Your central point is right, the economics are concerning, but I think it's been a ship slowly sailing away that we're now noticing has disappeared over the horizon.
Windows PCs were essentially a hacker's paradise and almost all of my friends assembled their PC or let their PC get assembled by a friend.
There was so much variety in the components; in fact you had to carefully match the mainboard and its chipset with the CPU and GPU as well as the RAM type. This was a lot of work, since all components had downsides and there were hidden gotchas, like for example the huge and looming CPU fan of later Intel CPUs.
If you didn't go for a large tower PC, you were essentially doomed. The CPU fan and GPU could overlap, the number of slots was a concern, and so was whether there were enough host slots available.
Bus system, cable length, jumper settings - so many ways to make your hardware bite you.
But at least it felt even after many years as your system, that ran locally. There was a lot of protest, when Microsoft started online activation. This marked the inflection point at which autonomy of your system suddenly eroded. Also downloading huge driver updates in order to get your NVIDIA GPU working - you needed a permanent online connection which wasn't needed earlier.
The niche is still there, probably as big as it was before. For example, as I grew weary of being subject to services I have little control over, I set up my own home server using a refurbished PC. It has been an amazing journey so far. But I don't think a normie would ever get interested in buying a refurbished Dell, install Debian on it, and set up their own services there.
As long as there is a niche of people interested in buying their own computers, there will be companies willing to fill that niche.
And a large number of those people jump at getting them an iPad instead of the perpetual tech support required.
This has been a thing for so long that the jokes were already old in 2005 - 21 years ago! https://www.penny-arcade.com/comic/2005/12/12/one-day-in-the...
There have been memory chip panics before; the US funded RAM production back in the 80s/90s in competition with Japan at the time.
The AI boom/"hyperscale" currently is almost exactly like the dotcom boom.
It's already starting to shake down. Anthropic is occupying the developer space, OpenAI has just exited the video/media production space. More focused and vertical market AI is emerging.
The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc etc is going to break.
Outside of the obvious economic effect of the dot com boom - the creation of near infinitely scalable high margin online businesses - there was a secondary effect on consumer electronics, with a massive growth in demand for networked devices; there was then much more of a balance between the hardware growth in the network infrastructure and data center worlds as well as in desktop and mobile.
The AI boom’s hardware impact is much more skewed, as this article details.
Yes but these Chinese firms are a tiny share of the overall RAM/SSD market, and they'll have the same problems with expanding production as everyone else. So it doesn't actually help all that much.
* Chinese firms finance through different banks and investors than current ram producers
* A company with a mission statement of consumer ram won’t have their supply outbid by data centers
* Chinese manufacturing has more expertise in scaling than any other manufacturing culture
I'd like to say a brief thank you to what the brief, golden period of globalisation was able to bring us.
I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.
I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.
Not everyone benefited. Market globalism wasn't particularly kind to the global south, and the specific mandates that the WTO enacted on countries in Latin America / Africa (Washington Consensus) greatly increased local wealth disparities despite visibly growing GDP for a time.
America profited handsomely because for most of the past 30 years, it was where the (future) transnational conglomerates were based. These companies stood to benefit from the opening up of international markets. Now that these companies are being out-competed by their Asian counterparts, instead of going back to the drawing board and innovating they are playing the "unfair trade practices" card, and of course the current administration is on board with it.
Globalisation is not going anywhere, but America is increasingly alienating itself from allies who it could stand to benefit from.
We're a clown show and we don't deserve to have friends until we get our shit together.
The largest upward transfer of wealth in Earth's history.
> was able to bring us.
A useless race to the bottom.
> that can be learnt about what we had
Rebadged imperialism. If you constantly screw native populations out of the value of their resources you will lose everything you build with those resources. Literally the lesson: make fair deals or die.
The uber-profit-driven enrichment of a very small group of people at the cost of entire communities or even populations should be rewarded with, like you say, death.
Me too, but without all the slavery this time please. It'll never work if some actors are willing to abuse their workforces to keep prices low as they do.
Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
With only a few kilobytes of code, you could send a UDP packet directly to your phone, with an app you "wrote" with just a few lines of code (to receive, without auto-confirmation).
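A minimal sketch of that idea in Python - the phone's address, the port, and the message are placeholders, and as noted there's no delivery confirmation:

    import socket

    # Sender: fire-and-forget one UDP datagram at a phone listening on the LAN.
    PHONE = ("192.168.1.42", 9999)   # placeholder address and port
    out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    out.sendto(b"package delivered", PHONE)
    out.close()

    # Receiver: the "app" on the phone side is not much longer.
    inbox = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    inbox.bind(("0.0.0.0", 9999))
    data, sender = inbox.recvfrom(1024)
    print(sender, data.decode())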
More seriously and more ironically, at the same time, we've now reached a strange time where even non-programmers can vibe-code better software than they can buy/subscribe to - not because models are that good, or programming isn't hard, but because of the enshittification that has left this industry rotten to the core and unable to deliver useful tools anymore.
What do I gain if more developers take this approach? Lightning fast performance. Faster backups. Decreased battery drain => longer battery service lifetime => more time in between hardware refreshes. Improved security posture due to orders of magnitude less SLOC. Improved reliability from decreased complexity.
It's been convenient that we can throw better hardware at our constraints regularly. Our convenience, much less our personal economic interests, is not necessarily what markets will generally optimize for, much like developers of Electron apps aren't optimizing for user resources.
Less bloat is 100% always a good thing, no matter what the market conditions are.
- https://xn--gckvb8fzb.com/projects/
Their github repos:
They even built a BBS-style reader client that supports Hacker News:
https://github.com/mrusme/neonmodem
I miss the days of the web being weird like this :-)
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
Whenever I read about fusion, I get reminded of a note in the sci-fi book trilogy The Night's Dawn. In that story, the introduction of cheap fusion energy had not cured global warming on Earth but instead sped it up with all the excess heat from energy-wasting devices.
What matters is not what we don't have, but how we manage that which we do have.
There are several challenges, not least of which is storage. We have considerable leakage in most of our current helium storage solutions on earth because it’s so light. Our national reserves are literally in underground caverns because it’s better than anything we can build. Space just means any containment system will need to work in a wider range of pressure/temperatures.
Non-helium hard drives are basically limited by their bearing spin hours. If one only spins a few hours a week, it'll probably run for decades. Not so with helium.
Doing some googling yields an estimated cost of about $25,000 per kg. I can see why extraction from wells is preferred.
The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire wrapped 8080 based system. That was followed by an also wire wrapped 8086 based system of my own design I used for day to day computing tasks (it ran Forth). If someone like me can get to the point of not caring there is no real reason for anyone else to care.
0.03 kW * 24 h/day * 365 days * $0.18/kWh = $47.30/year
Even CPU TDP is not an accurate measure. My latest AMD CPU will pull more than its rated "TDP" under certain loads.
TDP is a thermal measurement, it's how much heat energy your heatsink and fan need to be able to dissipate to keep the unit within operational temperatures. It does not directly correlate to the amount of electricity consumed in operation.
New systems idle at something like 25 Watts according to a lazy search. So 49-25=24W. That works out to $15/year hypothetically saved by going to a newer system. But I live in a cold climate and the heating season is something like half the year. But I only pay something like half as much for gas heat as opposed to electric heat. So let's just knock a quarter off and end up with 15-(15/4)=$11.25USD hypothetically saved per year. I will leave it here as I don't know how much the hypothetical alternative computer would cost and, as already mentioned, I don't care.
[1] https://forums.anandtech.com/threads/athlon-ii-x2-250-vs-ath...
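For what it's worth, that arithmetic written out - the 49 W and 25 W idle figures are the ones above, and the electricity rate is an example roughly consistent with the ~$15/year figure, not a quoted tariff:

    old_idle_w, new_idle_w = 49, 25   # idle draw of old vs. hypothetical new system
    rate_per_kwh = 0.07               # example rate; plug in your own bill

    delta_kwh_year = (old_idle_w - new_idle_w) / 1000 * 24 * 365   # ~210 kWh/year
    saving = delta_kwh_year * rate_per_kwh                         # ~$15/year

    # Half the year the extra draw is useful heat, but gas heat costs about
    # half as much as electric, so knock roughly a quarter off the saving.
    print(round(saving, 2), round(saving * 0.75, 2))               # ~$15 -> ~$11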
I also do caching and distributed compilation with sccache.
HDD/SSD?
Pretty surprising to have this thing still be working 17 years later, unless it spent a good chunk of that in 'cold storage'.
I think I can count on one hand the total number of drives I've ever had fail - and four of those were in the warranty period, back in the 90s-2000s.
The server ran non-stop for the first 10 years. The motherboard, a 790, failed and was upgraded to an 880G. One memory stick failed, replaced under lifetime warranty (Kingston), but the pair I received was slower, CL9-10-9 vs. 9-9-9 for the failed one. After 10 years my router and an rk3288 SBC took most of its jobs. I moved most of the hard drives (7x 2TB Seagate ST2000DL and 1 spare) into a DAS (SATA RAID enclosure) connected directly to the router, where they are still running. None failed. The server became an offline backup. I started it weekly to sync. Last week I replaced it with an rk3588 ITX board - not because it failed, but because I wanted to explore / play with the new ARM CPU.
The desktop is also still working. I bought it second-hand a few years after the first. It was used at least 4h every evening and at least 10h every weekend. I'm still using it right now. One HDD failed - it was a 120GB PATA Seagate from ~2004 IIRC. No data loss, it was in RAID1. One GPU failed, a GTS 250, upgraded to GTX 970, still working. I'm going to keep using it for at least 5 more years, possibly more. Firefox no longer supports Win7 and I'm in the process of migrating to Linux. Total Commander (I've been a user since Win 3.1) and file associations are holding me back. xdg-open is... absolutely horrible.
Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.
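For instance, the "SQLite instead of managed databases" part really is a few lines; a minimal sketch, with the file name and schema made up for illustration:

    import sqlite3

    # One file on disk replaces the managed database; it lives in the same
    # directory that gets backed up and synced like everything else.
    db = sqlite3.connect("notes.db")
    db.execute("""CREATE TABLE IF NOT EXISTS notes (
        id INTEGER PRIMARY KEY,
        body TEXT,
        created TEXT DEFAULT CURRENT_TIMESTAMP)""")
    db.execute("INSERT INTO notes (body) VALUES (?)", ("hold on to your hardware",))
    db.commit()

    for row in db.execute("SELECT id, body, created FROM notes"):
        print(row)
    db.close()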
FWIW might want to check https://github.com/wg-easy/wg-easy to remove yet another managed elsewhere piece of your setup.
So yeah, it's fun. But don't underestimate that time; it could easily be your time spent with friends or family.
I have a homelab that supports a number of services for my family. I have offsite backups (rsync.net for most data, a server sitting at our cottage for our media library), alerting, and some redundancy for hardware failures.
Right now, I have a few things I need to fix:
- one of the nodes didn't boot back up after a power outage last fall; need to hook up a KVM to troubleshoot
- cottage internet has been down since a power outage, so those backups are behind (I'm assuming it's something stupid, like I forgot to change the BIOS to power on automatically on the new router I just put in)
- various services occasionally throw alerts at me
I have a much more complex setup than necessary (k8s in a homelab is overkill), but even the simplest system still needs backups if you care at all about your data. To be fair, cloud services aren't immune to this, either (the failure mode is more likely to be something like your account getting compromised, rather than a hardware failure).
Sure - self hosting takes a bit more work. It usually pays for itself in saved costs (e.g., if you weren't doing this work yourself, you'd be paying money - which you had to do work to earn - to have it done for you).
Cloud costs haven't actually gotten much cheaper (but the base hardware HAS - even now during these inflated costs), and now every bit of software tries to bill you monthly.
Further, if you're not putting services open on the web - you actually don't need to update all that often. Especially not the services themselves.
Honestly - part of the benefit of self-hosting is that I can choose whether I really want to make that update to latest, and whether the features matter to me. Often... they don't.
---
Consider: most people are running outdated ISP-provided routers with known vulnerabilities that haven't been updated in literally years. They do ok.
Most of this I didn't do for many years because it is not my core competence (in particular the security aspects). Properly fleshed-out explanations from any decent AI will catapult you to this point in no time. Maintenance? Almost zero.
p.s. Admittedly, it's not a true self-hosting solution, but the approach is similar and ultimately leads to that as well.
For example I could never setup Traefik correctly because I just found it too complicated. Now I have Claude I finally got it setup just the way I want it - the ROI on my Claude subscription has been off the scale!
The obvious downside is that I might not really know what exactly I’m implementing and why. I do read all the explanations that Claude gives but it’s hard to retain this information. So there are pros and cons to relying on AI for this kind of stuff I suppose
If anyone reading this has struggled with servers accumulating cruft and requiring maintenance, I recommend NixOS.
Combine that with deploy-rs or similar and you have a very very stable way to deploy software with solid rollback support and easy to debug config issues (it's just files in the ./result symlink!)
They need to get over it.
Pick up some Ansible and/or Terraform/tofu and automate away. It can be as easy or as involved as you want it to be.
I've been renting a VPS for 15-20 years from a small provider. It runs a webserver, gitea instance, matrix homeserver, and a bunch of other things, and I spend maybe an hour or two per month maintaining it. Add a few non-recurring hours if I want to set up something new or need to change something big.
Self hosting is not hard. It's not scary. It's not a security nightmare. All of that is just FUD.
Is that likely? History says it's inevitable, but timeframe is an open question.
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...
The capital from the gulf is already disrupted. It isn't anymore a matter of if or when.
The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
That's not true at all.
There are a lot of people willing to buy smartphones with small screen or smartphones with Linux or any other OS than iOS or Android.
But those people are not enough to justify the gigantic initial investment that is necessary to provide viable products in this market. And the existing actors aren't interested in those niches.
That said.... hopefully at least on Android side you can get a free (as in unchastified) OS to run on it.
Until they come for the HW.
Think about it like this: imagine the AI/Cloud/Crypto companies who are buying up all these compute and storage resources realize they now control the compute hardware market, becoming compute lords. What happens when joe/jane six pack or company xyz needs a new PC or two thousand but can't afford them due to the supply crunch? Once the compute lords realize they control the compute supply they will move to rent you their compute, trapping users in a walled garden. And the users won't care because they aren't computer enthusiasts like many of us here. They only need a tool that works. They *do not* care about the details.
The hardware lords could further this by building proprietary hardware in collusion with the vendors they have exclusivity with: weaker terminal devices with just enough local RAM and storage to connect to a remote compute cluster. Hardware shortage solved!
All they need to do is collude with the hardware makers with circular contracts to keep buying hardware in "anticipation of the AI driven cloud compute boom." The hardware demand cycle is kept up and consumers are purposefully kept out of the market to push people into walled gardens.
This is unsustainable of course and will eventually fall over but it could tie up computing resources for well over a decade as compute lords dry up the consumer hardware market pushing people to use their hoarded compute resources instead of owning your own. We are in a period where computing serfdom could be a likely outcome that could cause a lot of damage to freedom of use and hardware availability and the future ability to use the internet freely.
For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.
Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.
It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.
The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).
To be fair, I only started this because the dock I got had 10G - https://www.owc.com/solutions/thunderbolt-pro-dock and I saw some 10G cards on eBay cheap and my old Nortel switch had a 10G uplink and ... well, you know how it goes!
I got my first PC circa 1992 (a 2nd hand IBM PS/2, 80286 processor with 2MB RAM and 30MB HDD) and the "golden age" was already there. We are well over 40 years of almost uninterrupted "pay less for more performances" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.
The actual important change is that for most consumer uses, the perf improvements stopped making sense already, what, over 10 years ago?
a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.
It will be scarcity mindset from here on out; will always buy the top tier thing .
In that sense, I suppose you could still make it work. Our society celebrated surrendering ownership of media to iTunes and Steam for our convenience, whittled down online content that didn't make us feel good, limited which applications we could install on our phones in the name of security and privacy, and eliminated our anonymity to save the kids. At this point, removing the hardware is the least surprising step, because as Captain Beatty says, "if you don’t want a house built, hide the nails and wood."
Or perhaps you were thinking of Brave New World.
"don't create the torment nexus, etc."
Oh bubbles... they're so bubbly. Remember when there was an unlimited demand for fibre optics because - The Internet? So Nortel and other manufacturers lent the money to their clients building the Internet, because the growth was unlimited forever? Except they actually didn't have any money, just stock valuations?
"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.
So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
Everything today is a web app. If it doesn't exist and you want to vibe code it? It's probably going to become a web app, vibed using a web app.
The problem is, web apps are stupendous memory hogs. We're even seeing Chromebooks with 8 gigs of RAM now. LLMs are all trained for and implemented in apps assuming the user can have $infinity browsers running, whether it's on their PC or on their phone. It's going to be very hard to change that in a way that's beneficial to what passes for business models at AI companies.
Ah, the paradoxes of modern software.
Even remote VDI instances are accessed through a web page now.
On top of that, add all the corporate bloatware and securityslop-ware, and suddenly my "thin" client is using 60% of 10 available cores and 85% of 16GB of RAM.
I don't think it needs an explanation on how insane that resource usage is.
Maybe it was a few years ago, I had to download a phone app just to try to restore delivery notifications on my Amazon echo. They no longer have a web interface to control it. Worse than that, I had to ask the AI to do it in said app.
Newer graphics hardware is pointless to me. The expensive new techniques I find incredibly offensive from an interactivity standpoint (temporal AA, Nanite & friends). I run Battlefield 6 at 75% render scale with everything set to low. I really don't care how ass the game looks as long as it runs well. I much more enjoy being able to effectively dispatch my enemies than observe aesthetic clutter.
https://about.bnef.com/insights/commodities/ai-data-center-b...
But also consider that PCs have been an anomaly for very long. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical pieces in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient model. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space but it wasn't growing anywhere and the state of software is a mess.
Apple ate the PC's collective lunch before DCs did. So have gaming consoles. So I weep for consumer choice, but as things become more advanced maybe PCs and their entire value chain don't make a lot of sense any more.
Obviously at the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown out of the loop entirely, but even then all those redundant meat sacks will need entertainment to keep them content). We have the consumer device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.
M1 Apple Silicon MacBook Airs are still good computers 5+ years after release.
Many games are still playable (and being released on!) the PS4, which is almost 12 years old.
The iPhone 15 pro has 8gb of RAM which will likely be sufficient for a long time.
Don't get me wrong, this whole parts shortage is exceptionally annoying, but we're living in a great time to weather the storm.
War aside, I also bet there's going to be a huge demand for edge-compute for other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it, the results were mixed.
Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.
Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention and it's obviously intending to fill your browser history with inappropriate content, which didn't work on my browser because I opened the blog in a private browser session. The operator clearly doesn't run his blog in good faith.
{ "Official Church of Scientology: Difficulties on the Job - Online Course", "Ask HN: How could I safely contact drug cartels?", "The internet used to be fun", "am I boring - Google Search", "what is punycode - Google Search", "arguments for HN comment - Google Search", "how to hack coworker's phone - Google Search", "censorship on hacker news - Google Search", "rust programming socks - Google Shopping", "Adult entertainment clubs - Google Maps", "Pick up lines suggestions - ChatGPT", "Online debate argument suggestions - ChatGPT", "The Flat Earth Society", "Amazon.com: taylor swift merch", "Amazon.com: waifu pillow", "/adv/ - topple government - Advice - 4chan", "r/wallstreetbets on Reddit", "Infowars: There's a War on For Your Mind!", "birds aren't real at DuckDuckGo", "Lincoln MT Cabins For Sale - Zillow", "The Anarchist Cookbook by William Powell | Goodreads", "Fifty Shades of Grey | Netflix", "jeff bezos nudes - Google Image Search", "zuckerberg nudes - Google Image Search", "bigfoot nudes - Google Image Search", "Rick Astley - Never Gonna Give You Up - YouTube", "Pennsylvania Bigfoot Conference - Channel 5 - YouTube", "Linus goes into a real girl's bedroom - Linus Tech Tips - YouTube", "MrBeast en Español - YouTube", "FTX Cryptocurrency Exchange" }
I thought it was clever. But it also seems ham-fisted, and in poor taste.
> You may want to consider linking to this site, to educate any script-enabled users on how to disable JavaScript in some of the most commonly used browsers. The following code uses scare tactics to do so.
> When added to your website, it will change the icon and the title of your website's tab to some of the most unhinged things imaginable once the user sends your tab to the background. Upon re-activation, the script will display a popover to the user informing them about the joke and referring them to this initiative.
You are also completely speculating on the intent. Less drama please.
Does all this not apply to businesses buying computers for their employees?
Enterprise business from small, medium to large get laptops or use mobile application and online SAAS.
The entire PC industry is for enthusiasts and a tiny segment of the world's computing needs. The laptop variation has already been eating into the PC market.
In today's world, it's just not practical to own a PC unless you are a gamer. And for gamers, it's just better to get a console. And for developers, there is more money to be made selling games on console than on PC.
In the end, from a business and revenue-generation standpoint, the custom PC industry is just a legacy of the old computing world. As I type this, the custom PC is more of a "marketing segment" for Nvidia, for example, to upsell Nvidia cloud offerings - I might be stretching too far - but that basically is the point.
The author with his custom designed website would be a much better spot to implement such a rent hardware scheme compared to the likes of HP.
uBlock Origin has prevented the following page from loading:
https://xn--gckvb8fzb.com/hold-on-to-your-hardware/
This happened because of the following filter:
||xn--$document
The filter has been found in: IDN Homograph Attack Protection - Complete Blockage

Then maybe you won't be able to buy a computer, but you'll be able to buy a server in a datacenter for half the price of one. Yours, one that if you wanted you could drop by the datacenter and ask for. But you won't, because you wouldn't be able to do anything with it. It's only useful in the server rack, so you'll buy it and leave it there; maybe after one or two years, depending on the contract, you'll have to pay rent, but it'll be incredibly cheaper than buying a PC and hosting yourself.
I don't know. Or maybe it's just a temporary shortage and we are just overreacting.
If the AI boom slows, it will free up manufacturing capacity for the consumer supply chain, but there is going to be a long drought of supply.
I’m doing more with a decade old GPU, which was manufactured before “Attention is all you need“, than I could 5 years ago, when quantization techniques were implemented.
I’m holding on to my 32 bit machines.
Most linux distributions dropped support for them (for good reason). But at the end of the day these machines are a fabric of up to ~ 4 billion bytes that can be used in a myriad of ways, and we only covered a fraction of the state space before we had moved on.
Phones and tablets only get replaced when they die.
Why should I throw away stuff that still works as intended?
I would specifically add, whatever you have, or whatever you choose to buy, it would greatly benefit you to ensure a degree of Linux compatibility so its lifespan can be extended further than the greed enthusiasts at MS, Apple, and Google would like. They will be facing the same declines in purchasing habits and are further incentivised to assert their ownership over what you might mistakenly consider your devices.
In my country, for an offline store purchase of a USB HDD, only the 4TB Seagate variant is available; that's 15,000 in our currency, almost 1.5 months' salary in the private sector.
Any higher size has to be imported, and with forex applied prices go up to four zeros. When I read people on YouTube or blogs saying they rotate 15TB and higher on their NAS RAIDs, that seems like just a dream for us, never to be fulfilled.
I must admit that my workflow is not that heavy.
The favicon changes every time you switch away to something different with various tab names (ranging from porn, to right-wing news, to 4chan and I'm sure more).
All this author has convinced me to do is block their website.
I also don't agree at all with the premise of the article so I don't imagine I'm going to be missing much by not seeing this site again.
These are entire datacenters built for one thing - and they're water-cooled and integrated so tightly that it's unlikely you can do much with any parts or part of the setup.
It's the difference between buying an off-lease Dell server and trying to buy part of a supercomputer - which is what these really are. They're not datacenters; they're custom-built supercomputers.
Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.
If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.
They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.
Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.
https://web.archive.org/web/20180513133803/https://www.techr...
Prices went down again after that.
To me this is just a temporary swing in the other direction - they're riding the gravy train while they can, because once it ends it's back to low prices.
At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of value as in free speech rather than strictly value as in free lunch, is important to call out.
I’m glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized cost of roughly one nice steak dinner per year.
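The arithmetic behind that is straightforward; the purchase price below is an assumed figure for illustration, not the commenter's actual number:

    # Back-of-the-envelope amortization for keeping old hardware in service.
    purchase_price = 1200                      # USD -- assumed 2009 MacBook price
    years_in_service = 2025 - 2009
    print(purchase_price / years_in_service)   # ~75 USD/year, steak-dinner territory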
The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.
However, there is another reason to look after and hang on to certain types of products long-term.
Tariffs.
If the trade barriers that the Trump administration has put up remain long-term, it fundamentally changes what can be built. High-volume items (like RAM) are the least likely to be affected. Low-volume, high-performance items are what are threatened. Say you're building a very specialized, very-low-demand item that's simply the best. You're probably going to source the best components and materials from several countries, build it in one place, and then ship it globally. You amortize the cost of tooling, etc. across the entire global market.
If a few countries throw up trade barriers, as the U.S. has done, your material costs go up and your access to markets decreases. People on the other side of those trade barriers may suddenly not be able to afford your products. Supply gets more expensive and demand drops. What was marginally profitable in the old world order becomes uneconomic in the new order. Such items aren't going to be magically on-shored to the U.S.; they're just not going to be made anymore.
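A toy model of that amortization argument, with made-up numbers purely for illustration:

    # Fixed tooling cost spread over the addressable market. If trade barriers
    # shrink that market, per-unit cost rises and a marginal product dies.
    tooling_cost = 2_000_000     # USD, one-time (made up)
    unit_cost    = 300           # USD per unit, materials + assembly (made up)
    price        = 500           # USD, what customers will pay (made up)

    def profitable(units_sold):
        per_unit = unit_cost + tooling_cost / units_sold
        return price > per_unit

    print(profitable(20_000))    # True  -- global market: ~$400/unit all-in
    print(profitable(8_000))     # False -- walled-off market: ~$550/unit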
If you own something that's niche and barely profitable to make, that's what you should look after and take care of, because more of it might not be made for a while if trade barriers don't come back down.
----------
[1] https://research.google/blog/turboquant-redefining-ai-effici...
Most commentators seem to have missed that the TurboQuant paper was posted in April 2025:
https://arxiv.org/abs/2504.19874
So it's been public knowledge for almost a year, and internal Google knowledge for longer. Did Google reduce its capital expenditure plans over the past year? Let's ask Google:
Google is expecting to see a capex of between $175-185 billion in 2026, approximately double that of 2025.
https://www.datacenterdynamics.com/en/news/google-estimates-...
Cha cha cha ...
personally, i've been stretching the most life out of my machines, particularly because there has been some supply-demand "issue" happening ever since i could afford to buy my own.
my previous computer was a laptop without any tensor cores, bought right at the onset of the lockdown-related supply chain issues. i was ok with stretching it beyond the better half of a decade before it died on me.
now i've had to build a computer out of necessity while paying 5x for ram. it took a lot to suppress the internal rationalisation of waiting it out for overall costs to improve, but in this decade they almost never have!!
stepping back, we forget to appreciate how good the cost of computing relative to the cost of living has been to us for almost two decades. now, thanks to opportunistic greed and real issues, it is going back to the older times. one can still, for the price of a month's rent*, afford a decent computer that will last years.
These are super interesting problems. However, it seems like selection pressures, or just pure greed, attract people to the "easiest" solution: pure domination. You don't need to care about any of these problems (well, eventually you still do, but not in the minds of said people) if you just have pure, utter control over every part of the stack.
There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.
Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
Whatever happens, it's crazy, and I hope the AI madness is worth it.
For example, my current ThinkPad T14 Gen 5 was bought with 8GB RAM and a 256GB NVMe drive, then upgraded to 64GB RAM and a 2TB NVMe for the same price the 16GB/512GB configuration would have cost at Lenovo. And I still have the 8GB/256GB parts to re-use or re-sell.
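The rough math behind buying the base configuration and upgrading it yourself looks something like this; every price is a placeholder, not Lenovo's actual quote:

    # DIY upgrade vs. factory configuration, illustrative prices only.
    base_8gb_256gb     = 1100    # USD, assumed base config
    aftermarket_64gb   = 180     # USD, assumed 64 GB SODIMM kit
    aftermarket_2tb    = 150     # USD, assumed 2 TB NVMe drive
    factory_16gb_512gb = 1430    # USD, assumed factory 16GB/512GB config

    diy = base_8gb_256gb + aftermarket_64gb + aftermarket_2tb
    print(diy, "vs", factory_16gb_512gb)   # comparable outlay, 4x the RAM and storage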
Those who are best able to use a resource are willing to pay the most for it, thus pricing out unproductive uses of it.
This is pure Capitalism.
If one is in general against Capitalism, yes, one can complain.
But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.
Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant: he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from it than you do, so he's willing to pay more. The markets will work. If this is an unproductive use of Capital, OpenAI will go bankrupt.
And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price sensitive customers (retail), when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them.
Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin.
AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium, you either become Compute or you will fade away.
The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.
Even if we have to wait for this old world cabal to die and fade away, time is still on our side.
Boomers are stupid for using time as a weapon.
I’m chillin. Waiting for people to die while growing my businesses.
Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.
You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because it is the only practical way to run whatever AI-driven software exists three years from now.
Edit: That’s an example; it goes beyond AI. And:
Liberty goes beyond that.
https://www.reddit.com/r/LocalLLaMA/comments/1s0czc4/round_2...
- "Stare into this hole to verify your age.
- "Stick your finger in the box.
- "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation accelerated internet."
- "Sync ALL of your usernames and passwords into this secure enclave."
Every packet and data stream will be analyzed locally by the AI to determine intentions and predict future behavior. The AI-summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic, but it will be something in this spirit, and people will sleepwalk right into it.
My only question is which country will control the behavior of these chips.