There is another way forward: not building these data centers, forcing AI companies to use power more efficiently, and directing the excess energy production capacity toward the energy transition, in order to avoid the worst consequences of climate change.
It's not going to happen, at least not right now, but it's clearly what we ought to do. ChatGPT can wait.
How? Also, why? Why single out datacentres to tamp down on, versus other industrial and commercial uses?
This reminds me of California rationing residential water use so alfalfa farmers can flood their fields.
I do like the market insulation idea you propose in another comment (I would link to it, but apparently HN doesn't allow that).
Why? American datacentres--of all types--use about 250 TWh per year, with another 500 TWh of capacity expected by 2030 [1]. American paper manufacturing alone used roughly that combined total in 2018 [2].
[1] https://www.iea.org/reports/energy-and-ai/energy-demand-from...
[2] https://www.eia.gov/energyexplained/use-of-energy/industry.p... 2,491 trillion BTU ≈ 730 TWh
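For anyone checking the footnote arithmetic, here is a quick sketch of the BTU-to-TWh conversion (the constants are standard physical definitions; the 2,491 trillion BTU figure is from [2], and the arithmetic itself is mine):

```python
# Convert trillions of BTU to TWh via joules.
# 1 BTU = 1055.06 J (International Table BTU); 1 TWh = 3.6e15 J.
BTU_IN_JOULES = 1055.06
TWH_IN_JOULES = 3.6e15

btu_total = 2_491e12  # 2,491 trillion BTU (EIA figure for paper manufacturing, 2018)
twh = btu_total * BTU_IN_JOULES / TWH_IN_JOULES
print(round(twh))  # ~730 TWh
```

So the footnote's ~730 TWh checks out, which is roughly the 250 TWh of current datacentre use plus the 500 TWh of expected additional capacity.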
This is not a theoretical concern, it is happening already.
It is, however, complete nonsense, and the next few years of failed promises on AGI will eventually bring people to their senses, if a market crash and sustained economic depression don't do that first. It would be funny if it weren't going to cause suffering for millions of people, whether we succeed at AGI or not.
I _like_ AI, I find LLMs and many other aspects of the field useful, and I am optimistic about the long-term prospects of AI. But the rush to get to AGI is completely out of control at this point, and the fallout when the bubble pops will set AI, and our societies, back a long time.
I'm all for more efficient usage, and it's in AI companies' best interest to minimize costs that way.
...but it's a growing industry, it will need more power.
Completing the energy transition is an enormous undertaking. Building huge data centers is a distraction, not a way forward.