Eventually the music will stop when the easy money runs out and we'll see how much people are truly willing to pay for AI.
Everything was cheap. Samsung sold SSDs at a loss that year.
TSMC and other suppliers did not invest as much in capex in 2022 and 2023 because of the crash.
Parts of the shortage today can be blamed on those years. Of course ChatGPT also launched in late 2022 and the rest is history.
[0] www.trendforce.com/presscenter/news/20221123-11467.html
"but this time is different, it's not a bubble, there's real value there"
Economists use the term “bubble” to describe an asset price that has risen above the level justified by economic fundamentals, as measured by the discounted stream of expected future cash flows that will accrue to the owner of the asset.
I think there's little argument that is happening, the question is more about to what extent is it a bubble.
The entire global software industry is worth less than $1 trillion. Or in other words, smaller than the current valuation of just OpenAI + Anthropic.
Planned capital investment this year by the Magnificent 7 alone is $600B. More than 2/3 of the total global software industry. In one year. Good luck buying any computer hardware this year, there will be a shortage of everything, including electricity.
It's a bubble. But when does the music stop?
It's always been cycles of cheap production, and then human-created demand or catastrophes reduce supply and push prices back up again.
Not to mention that without enough competition, you can just raise prices, which, uh (gestures at Nvidia GPU price trends...)
They didn't spin up additional mask production b/c they knew the pandemic would eventually pass. They learned this lesson from SARS.
Not maxing out production during spikes (or seasonality) in demand is a key tenet of being a "rational economic actor".
But as it is it's not like they made any bad decisions either.
Thus far, we've not found that point.
Very good.
Looks like all the money reserves big companies have been sitting on are gone. Circular money deals are in full swing, and now it looks like some companies are looking for loans.
Not sure how much longer this can go on until it comes crashing down.
And if they were running 24/7, maybe setting up another factory or line will avoid some of the 24/7 scheduling.
I don’t quite follow narratives like yours about nation states and investors. There is certainly an industrial bubble going on, and lots of startups are getting massive amounts of capital, but I think there is a strong signal that a good part of this demand is here to stay.
This will be one of those scenarios where some companies will look brilliant and others foolish.
These contracts are then transferrable. The manufacturer can start work on a factory knowing they'll get paid to produce the drives.
If the AI boom comes to an end, the manufacturer is still going to get paid for their factory, and if the AI company wants to recoup costs they could try to sell those contracts back to the manufacturer for pennies on the dollar, who might then decide (if it is more profitable) to halt work on the factory - and either way they make money.
Every year a few farmers realize they are contracted to deliver more grain than they have in their bins and so have to buy some grain from someone else (often at a loss) just to deliver it. This isn't a common problem but it happens (most often the farmer is using their insurance payout to buy the grain - snip a very large essay on the complexities of this)
You are also forgetting that the payback period on a plant is not a single year, it will be over many years and most likely no buyer is wanting to arrange purchasing that far out.
I don’t see how what you described is grounded in reality, even for “smart manufacturers”.
This sounds like economic dogma based on pointing at some future equilibrium.
I like the saying that goes something like "life is what happens when you are waiting for the future". In the same way, it seems to me that equilibrium is increasingly less common for many of us.
Markets are dynamic systems, and there are sub-fields of economics that recognize this. The message doesn't always get out unfortunately.
> But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.
This feels like more dogma: find a convenient scapegoat: governments.
Time to wake up to what history has shown us! Markets naturally reflect boom and bust cycles, irrationality of people, and various other market failures. None of these are news to competent economists, by the way. Be careful from whence you get your economic "analysis".
Cheap hard drives and ram, yay! Perhaps GPUs too!
Seeing the first mover succeed, every Tom, Dick and Harry wants to emulate them. It distorts prices because people will pay a premium for everything. Then there is surplus supply and no takers. People are caught with their pants down and things go for cheap.
This repeats ad nauseam. Whether it was building out ISPs during the early 2000s or the abundance of streaming services when every media company wanted one. Just because the corporate overlord doesn't want to look foolish for not following a trend.
So there is always use for more compute to solve problems.
Fiber installations can overshoot relatively easily. No matter how much fiber you have installed, that 4k movie isn't going to change. The 3 hours of watch time for consumers isn't going to change.
Hardware just depreciates much much faster than fiber
The manufacturing capacity expanded to meet the demand for new hardware doesn't (as much). If demand drops for a year, they're likely to start shedding capacity, one way or another.
This is not an equivalent situation. The vast, vast majority of what's being produced for this bubble is going to be waste once it pops.
There's clearly easy/irrational money distorting the markets here.
No, I think it is real demand. AI will cause shortages in everything from GPUs to CPUs, RAM, storage, networking, fiber, etc. because of real demand. The physical world can't keep up with AI progress. Hence, shortages.
AI simply increases computer use by magnitudes. Now you can suddenly use Seedance 2.0 to make CGI that would have cost tens of millions five years ago for $5.[0] Everyone is going to need more disk space to store all those video files. Someone in their basement can make a full-length movie limited only by imagination. The output quality keeps getting better, faster.
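A rough back-of-envelope shows how quickly that video piles up. The bitrate here is my own assumption for illustration, not a figure from any generator:

```python
# Back-of-envelope: storage for one AI-generated feature film,
# assuming a 4K delivery bitrate of 20 Mbit/s (an assumption).
hours = 2
bitrate_mbps = 20
size_gb = hours * 3600 * bitrate_mbps / 8 / 1000  # Mbit -> MB -> GB
print(size_gb)  # 18.0 GB per film
```

At roughly 18 GB per film, a thousand basement filmmakers keeping a few drafts each is already hundreds of terabytes.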
AI agents also drastically increase storage demands. Imagine financial companies using AI agents to search, scrape, organize data on stocks that they wouldn't have been able to do prior. Suddenly, disk storage and CPUs are in high demand for tasks like these.
I think the demand for computer hardware and networking gear is real and is only the beginning.
As someone who is into AI, hardware, and investing, I've been investing in physical businesses based on the above hypothesis. The only durable moats will be compute, energy, and data.
"Compute" is capital investment; normal and comprehensible, but on a huge scale.
"Data" is .. stolen? That feels like a problem which has been dodged but will not remain solved forever, as everyone goes shields-up against the scrapers.
"Energy" was a serious global problem before AI. All economic growth is traded off against future global temperature increases to some extent, but this is even more acute in this electricity-intensive industry. How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?
The premise here is that if we use more electricity then we burn more carbon. And the media hates AI, so if anybody restarts any coal-fired power plant to run a data center anywhere, that's the story. But then there's this:
https://electrek.co/2026/01/28/eia-99-of-new-us-capacity-in-...
Nobody actually wants coal because solar is cheaper.
And data centers are a pretty good combination for this because the biggest problem with solar and wind is what to do during multi-day periods of low generation, but data centers have backup generators and would be willing to turn them on whenever the cost of grid power is higher than the cost of operating them. Running some gas turbines for a week every two years in exchange for stabilizing the grid and being able to run on renewable power for the other 103 weeks is a pretty good outcome for everybody, not least because that amount of grid stabilization would exceed their consumption, i.e. allow more renewables to be added to the grid than they're using. If they can shed 1GW of load when a 2GW (long-term average) solar farm is generating at 50% of typical capacity for a week, you can add that 2GW of solar to the grid and remove 1GW of fossil fuels even while the data center is increasing consumption by 1GW.
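The arithmetic in that last sentence can be sanity-checked with a toy calculation (all figures are the hypotheticals from the scenario above, not real grid data):

```python
# Hypothetical figures from the scenario above.
solar_added_gw = 2.0    # long-term average output of the new solar farm
dc_new_load_gw = 1.0    # extra data-center consumption
dc_sheddable_gw = 1.0   # load the data center can drop during a lull

# During a week-long lull, the solar farm runs at 50% of typical output.
lull_output_gw = solar_added_gw * 0.5

# The shed load covers the lull shortfall...
shortfall_covered = dc_sheddable_gw >= (solar_added_gw - lull_output_gw)

# ...so the grid nets out: +2 GW solar against +1 GW of new load
# leaves 1 GW of fossil generation that can be retired.
fossil_retired_gw = solar_added_gw - dc_new_load_gw
print(shortfall_covered, fossil_retired_gw)
```

The whole argument hinges on the data center actually being willing to shed that load when asked, which is why the backup generators matter.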
Billionaire. And they are definitely willing to make the trade.
If AI will permanently cause an increase in hard drives over the current growth curve, then WD, et al will build new capacity, increasing supply (and reducing costs). But this really isn’t something that is known at this point.
By the way, plenty of people on HN and Reddit ask if the demand is real or not. They all think there's some collusion to keep the AI bubble going by all the companies. They don't believe AI is that useful today.
Maybe now we will start to see "optical" CPUs start to be a thing. Or 3D disk storage, or other groundbreaking technology.
Real demand, sure, I agree, but maybe not retail or business demand; at the moment the "demand" is entirely VC demand.
It's a really distorted market which is to be expected in any bubble/hype phase. The current retail/business demand doesn't appear to exist at the price point these investments require - even at the low low cost of "free, gratis and for nothing", not enough consumers and businesses are signing up.
The ones really going all-in on AI are the slop-producers. I dunno if slop is enough to pay back the investment into AI - I mean, even the slop producers are going to realise that paying $200/m to produce something in 1/10th of the time is a race to the bottom because someone else on the same plan is going to do the same, but cheaper.
> The physical world can't keep up with AI progress. Hence, shortages.
I think the word "progress" is inaccurate there - the physical world is supplying product to meet demand sustained by VCs' money.
It's not "cannot keep up with progress", it's "cannot keep up with demand from VCs".
> The only durable moats will be compute, energy, and data.
That'll be a first :-) Physical commodities have never been moats on their own before.
And even if they were guaranteed to be non-deterministic, there is still lots of value in many aspects of content generation.
Sounds right
> there's no price that is too high for these supplies.
Are you saying even higher prices won't increase supply? I don't understand.
Why must gamers be the most important group?
The above was their prediction during the crypto boom, and it turned out correct. I'm not sure how AI will turn out, but it isn't unreasonable to predict that AI will also move to dedicated chips (or dies?) in a few years, thus making gamers more important, because gamers will still be buying GPUs when this fad is over. Though of course if AI turns out to be a constant long-term demand for more/better GPUs, they are more important.
Gamers are not the only important GPU market. CAD comes to mind as another group that is a consistent demand for GPUs over the years. I know there are others, they are all important.
Besides, a 1080 had 8GB, a 5080 has 16GB. Doubling in 10 years isn't groundbreaking. The industry put the VRAM into industrial chips; it didn't make it to consumer hardware.
What games have had to deal with instead is inference-based upscaling solutions, i.e. using AI to sharpen a lower-res image in real time. It seems to be the only trick being worked on at the moment.
I can't think of anything useful crypto did.
The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations is, in my guess, that there is more value here than what we have tapped so far, right?
The revenues that Nvidia has reported are based on what we hope we will achieve in the future, so I guess the whole thing is speculation?
> The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations is, in my guess, that there is more value here than what we have tapped so far, right?
I think the value now comes from how we make a product of it, for example OpenClaw. Whether we like it or not, AI is really expensive to train, not only in money but also in resources, and the gains have been diminishing with each “generation”. Let's not forget we heard promises that have not been fulfilled, for example AGI, or “AI could potentially cure cancer, with enough power”.
My point is that directly or indirectly all hardware companies depend on memory and storage. If AI companies fall this could have repercussions to the whole industry.
What if in the near future it is simply too expensive to own "personal" computers? What if you can no longer buy used computers from official channels but have to find local shops or sharpen up on soldering skills and find parts from dumps? The big techs will conveniently "rent out" cloud computer for us to use, in exchange of all of your data.
"Don't you all have cellphones?"
The worry is that at some point the older hardware will stop working.
Once the phone makers realize that they can sell phones and docking stations to businesses, because 90% of knowledge work seems to happen in a web browser through one SaaS or another, I think personal computers will be cooked.
Also pulling and shredding hard drives is cheaper than paying someone to run DBAN or equivalent (which can take many hours to complete on a large drive), and there's no easy way to securely erase an SSD if it wasn't encrypted from the beginning.
I probably will only need to return newest laptop if I leave the company.
I don't know if TSMC has anything to do with hard drive production, but the reliance on very few players is also a problem in that industry.
Indeed, investors left to their own devices act in this way. Underlying such a single point of failure is an implied but immense hope for, and thus pressure toward, stability. I wonder what the prediction markets are saying about current levels of geopolitical stability in Taiwan?
Interesting. Capitalism is often touted to be more decentralized than socialism, but this is an example of how it can centralize.
Isn't this just taking the oft-proposed explore vs exploit dichotomy to the logical conclusion of the "exploit" side?
Every single arbitrarily-finely-divided thing "should" be handled by the single (group|process) that has the greatest relative advantage at that one thing.
And you end up with the total variety/detailedness of everything matching what the substrate of the economy (ie, people with specialized training or education) has capacity to support. So at the limit there is at most one person who knows how to do any one specific thing.
(And the global economic system becomes infinitely fragile, but eh who's counting.)
Turns out letting a bunch of MBAs plan your economy is extremely foolish.
And isn't it also in a seismically active region, prone to earthquakes and/or tsunamis?
Stop getting your news from news.
> that has all of their important factories on a single island that's under constant threat of invasion.
Threat of invasion? Who would dare invade Taiwan when it's protected by China?
> I don't know if TSMC has anything to do with hard drive production
Then why bother commenting here?
> but the reliance on very few players is also a problem in that industry.
Ah, you have a political agenda.
Is the profitability of these electronics manufacturers more likely than that of the companies buying up all their future inventory?
If AI has a bubble burst, you could see a lot of used hardware flood the market and then companies like WD could have a hard time selling against their previous inventory.
If it's long term, it would be better to be the front runner on additional capacity, but that's assuming continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.
And if they produce lot of video, they might keep copies around.
I was toying with getting a 2T HDD for a BSD system I have, I guess not now :)
https://www.washingtonpost.com/technology/2026/01/27/anthrop...
As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use: you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property law, but it is narrower than plagiarism. For instance, writing an open source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.
Copyright laundering is a problem though, and AI is very resource intensive for a result of dubious quality sometimes. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.
If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine can give proper attribution and context, it's not a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.
Modern society seems to assume any work by a person is due to that person alone, and credits that person only. But we know that is not the case. Any work by an author is the culmination of a series of contributions, perhaps not to the work directly, but often to the author, giving them the proper background and environment to do the work. The author is simply one that built upon the aggregate knowledge in the world and added a small bit of their own ideas.
I think it is bad taste to pass another's work as your own, and I believe people should be economically compensated for creating art and generating ideas, but I do not believe people are entitled to claim any "ownership" of ideas. IMHO, it is grossly egoistic.
The ones who spend billions on integrating public cloud LLM services are not the ones writing that function. They are managers who based on data pulled out of thin air say "your goal for this year is to increase productivity by X%. With AI, while staffing is going slightly down".
I have to watch AI-generated avatars on the most boring topics imaginable, because the only "documentation" and link to an actual answer is in the form of a fake person talking. And this is encouraged!
Then the only measure of success is either AI services adoption (team count), or sales data.
That is the real tragedy and the real scale: big companies pushing (external!) AI services without even proof that the benefit justifies the cost, and smooth talking around any other metric (or the lack of one).
So I'm getting tired of the argument that LLMs are "plagiarism machines" -- yes, they can be coaxed into repeating training material verbatim, but no, they don't do that unless you try.
Opus 4.6's C compiler? I've not looked at it, but I would bet it does not resemble GCC -- maybe some corners, but overall it must be new, and if the prompting was specific enough as to architecture and design then it might not resemble GCC or any other C compiler much at all.
Not only do LLMs mimic human thinking, but also they mimic human faults. Obviously one way in which they mimic human faults is that there are mistakes in the LLMs' training materials, so they will evince some imperfections, and even contradictions (since there will be contradictions in their training materials). Another way is that their context windows are limited, just like ours. I liken their hallucinations to crappy code written by a tired human at 3AM after a 20 hour day.
If they are so human-like, we really cannot ascribe their output to plagiarism except when prompted so as to plagiarize.
6-7 years ago when GPU prices went up, I hoped nothing would break. Last year when RAM prices went up I did the same. Now with drive prices going up, it's the same thing.
It's interesting because I've always built mid-tier machines over the years, and it used to cost in the neighborhood of ~$700. Now the same thing is almost double that, but the performance is nowhere near twice as good for general computer usage.
We're fucking doomed.
It’s building materials being in short supply when there’s obviously more houses than buyers. That’s just masked at the moment because of all the capital being pumped in to cover for the lack of actual revenue to pay for everything. The structural mismatch at the moment is gigantic, and the markets are getting increasingly impatient waiting for the revenue to materialize.
Mark this post… in a few years folks will be coming up with creative ideas for cheap storage and GPUs flooding the market after folks pick up the pieces of imploded AI companies.
(For the record, I’m a huge fan of AI, but that doesn’t mean I don’t also think a giant business and financial bubble is about to implode).
COVID was six years ago. In that time, GPU prices haven't gone down (and really have only increased). Count me skeptical that there will be a flood of cheap components.
But there was a window as recently as fall (3-5 months ago) where you could get most PC parts at MSRP. Granted it was a pretty short window, before the last dying whispers of crypto and COVID induced scarcity were overtaken by the surge of the AI bubble.
> It’s building materials being in short supply when there’s obviously more houses than buyers.
That I think is a hard one to prove and is where folks are figuring it out. There is obvious continued demand and certainly a portion of it is from other startups spending money. I don’t think it’s obvious though where we are at.
One can only hope that that's the principle at work here, anyway. It could also be a critically damped system for all I know. Unfortunately I studied control systems too...
On the other hand, if there is bigger economic turmoil, that might mean the postponed demand does not realise, as there is no purchasing power...
I would love if more non-traditional economists got involved in the public sphere by which I mean: writing about economic trends, public policy, regulation, rate-adjustment, etc.
The market says the more you buy, the better pricing you get. Once you start capturing a large share of the product, the price should go up, not down - exponentially.
Example: a person that owns 10 houses is restricting the ability of others to own a single home. By increasing the cost of excessive product ownership, you reduce the amount of product that people will hoard and allow others to gain access to it.
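As a toy illustration of that idea (the threshold and multiplier are invented for the example, not any real pricing scheme):

```python
def progressive_price(base, units, threshold=10, factor=1.5):
    """Toy progressive pricing: the first `threshold` units cost `base`
    each; every unit after that costs `factor` times the previous one,
    so hoarding at scale gets expensive fast."""
    total, price = 0.0, float(base)
    for i in range(units):
        total += price
        if i + 1 >= threshold:
            price *= factor  # escalate once past the threshold
    return total

print(progressive_price(100, 10))  # 1000.0: flat pricing up to the cap
print(progressive_price(100, 12))  # 1375.0: units 11 and 12 cost 150 and 225
```

A small buyer pays list price; a buyer trying to take 40% of supply pays a superlinear premium.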
If you try to use government to force reality to conform to your idea of how things should work you're just going to get 1,000 companies buying 10,000 hard disks each rather than 10 companies buying 1,000,000 each. And if you try to outlaw that somehow then the market will just route around your new scheme in another way, creating even more unintended consequences in the process.
If you must meddle, you're much better off working with market forces rather than trying to fight against them.
Unfortunately, there are several such outlier entities which collectively control enough resources to price literally everyone else out.
I buy one hard disk; an AI company buys 40% of global supply. Me not buying that one hard disk is not going to change anything.
>According to Western Digital, thanks to a surge in demand from its enterprise customers, the consumer market now accounts for just 5 percent of the company's revenue.
On the other hand, lots of people here are even more uncomfortable with the other option, which is quite possible: AI software algorithms may scale better than the capacity of the companies that make the hardware. Personally I think hardware is the harder of the two to scale, and this is just the beginning.
The replacement arrived also in a paper bag and went straight back, this time for a refund.
I guess I should have kept that one and hoped for the best.
Good alternatives? I’ve only recently been enlightened on how profoundly sh__ty SSD is for long-term storage and I have a whole lot of images my parents took traveling the last few years of their lives.
And I’m not keen on having anyone ship me one of these anymore.
Walmart sells what appears to be an older version of the drive and I might have to cross my fingers and just get one of those.
Isn't that what you're doing ordering off Amazon, with their commingled inventory?
Besides, there's a spectrum of sellers between "Amazon" and "anybody", you can even, perhaps, purchase directly from the manufacturer.
How does external compare to internal, if at all? Is 3.5" going to last longer than something smaller?
The solution is just a lot of redundancy for larger disk arrays whenever practical. I currently have a 15x1TB 7200 RPM zpool in raidz2 I use for "scratch space" for some automation projects. It writes about 500GB-1TB or so a day and has for... over 18 years. I have had exactly one drive fail from that pool, under heavy abuse. That one failed a year or two in. Prior to my personal use it was beat on (mostly reads) as backing storage for uploaded images for a large website where the drives operated at 90% or higher I/O utilization pretty much 24x7.
I have other pools of disks where I have replaced over 50% of them 6 years in, with batches of failures seemingly at random. You start to notice patterns with various drive models - but not until well after the point of purchase where it's far too late to predict based on anything like vendor reputation or whatnot. I've had batches of various WD, Seagate, Toshiba, and HGST all both be incredibly reliable and some incredibly not so. Some of the same model series just different drive sizes have wildly different reliability characteristics.
I don't bother pulling "old" drives out of production preemptively any more. The only thing I do preemptively now is pull drives with very critical SMART prefailure warnings such as a consistently growing number of unrecoverable sector errors. That one and a couple other attributes are worth watching trends for, but the rest are pretty pointless and really do not seem to correlate much. And again, it varies by drive model for which times to pay attention to a particular SMART attribute and which not to.
I simply treat drives as wear items that fail with little to no notice, and just make sure I can survive a number of simultaneous failures at once. Make sure to regularly test your monitoring!
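For what it's worth, the "watch trends, not snapshots" approach above can be sketched in a few lines. The attribute name and thresholds here are my assumptions for illustration; real values would come from parsing `smartctl -A` output:

```python
# Toy sketch: flag a drive when a prefailure counter (here using
# Reallocated_Sector_Ct as one example attribute) keeps growing across
# successive polls, rather than alerting on any single nonzero reading.
def failing_trend(history, attr="Reallocated_Sector_Ct", min_rises=2):
    """history: list of {attribute_name: raw_value} snapshots, oldest first."""
    values = [snap.get(attr, 0) for snap in history]
    rises = sum(1 for a, b in zip(values, values[1:]) if b > a)
    return rises >= min_rises

polls = [
    {"Reallocated_Sector_Ct": 0},   # healthy baseline
    {"Reallocated_Sector_Ct": 4},   # first growth: take note
    {"Reallocated_Sector_Ct": 9},   # still growing: pull the drive
]
print(failing_trend(polls))  # True: consistent growth, worth replacing
```

A flat nonzero count never trips the check, which matches the point that a static blemish is far less predictive than a rising one.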
Not power cycling drives is huge as well, as you note. For example these old 1TB spinners:
9 Power_On_Hours 0x0012 078 078 000 Old_age Always - 158328
12 Power_Cycle_Count 0x0032 100 100 000 Old_age Always - 13
I'd have thought HDDs aren't at the top of the list for AI requirements - are other component manufacturers struggling even more to meet demand?
If we weren’t talking about AI, was there another high demand sector / customer for spinning platters?
And their margins get fat now that supply is relatively constant but AI demand has saturated their current production numbers.
thanks, AI-boosting motherfuckers, thanks a lot
So they’re unwilling to spend on increasing capacity because they don’t expect this demand to last.
Yes, AI is nice, but I also like to be able to buy some RAM and drives…
There is one exception though. Open WebUI with a whopping 960 MB. It's literally a ChatGPT interface. I'm only using external API providers. No local models running.
Meanwhile my website that runs via my own Wordpress-like software written in Rust [1] requires only a few MB of RAM, so it's possible.
More likely a couple of big financing wobbles lead to a fire sale.
It isn't practical for HDD supply to be wedged because in 5 years the disks start failing.
The main reason I do not prioritise AI usage in my own life is to retain my skills and mental acuity. All of the forms of computing and opportunities that I value do not require AI to achieve. I can understand why people feel differently from me, though, because AI and AI-adjacent things are where all of the money is right now.
Well, at least they might still have a product to sell once the AI bubble pops, unlike Nvidia, which seems to have kinda forgotten to design new consumer GPUs after getting high on AI money.
That's either a typo, or NVidia has achieved some previously unheard of levels of innovation.
Spoiler from the future: it hasn't. Get your investments in while you have time.
I love modern world so much /s