> What I'm saying is that using gigawatts of power for "AI" in this day and age is madness
Why? American data centers, of all types, use about 250 TWh per year, with another 500 TWh of additional capacity expected by 2030 [1]. American paper manufacturing used a comparable amount of energy in 2018 [2].
If I read the data right (1), the US currently produces roughly 4,000 TWh of electricity every year, so 500 TWh is a significant portion of that! The US will need a lot of additional capacity for things like electric cars and heat pumps. Most of the effort should go toward that, not toward huge data centers serving unproven demand (it remains to be seen how many people will pay the real price for ChatGPT once the VC subsidy ends).
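For scale, the back-of-envelope arithmetic (a minimal sketch using only the figures quoted in this thread, both of which are rough estimates):

```python
# Figures as quoted above (TWh/year); both are approximate.
us_generation = 4_000   # roughly current annual US electricity generation
dc_additional = 500     # additional datacenter demand expected by 2030

print(f"{dc_additional / us_generation:.1%} of current US generation")  # -> 12.5%
```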
The power sources for some of those future data centers will be local and won't necessarily draw on the US grid. Consider also that cement production uses about 3,000 TWh per year worldwide, and aluminium smelting about 1,000 TWh per year.
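Putting those figures side by side (a rough sketch using only the numbers quoted above; note it mixes US-only demand with worldwide figures, and cement energy is largely thermal rather than electric, so it's apples-to-oranges in places):

```python
# Figures as quoted in this thread (TWh/year).
us_dc_today = 250         # current US datacenter consumption
us_dc_added_2030 = 500    # expected additional US capacity by 2030
cement_world = 3_000      # worldwide cement production (largely thermal)
aluminium_world = 1_000   # worldwide aluminium smelting

us_dc_2030 = us_dc_today + us_dc_added_2030
print(f"US datacenters by 2030: ~{us_dc_2030} TWh/yr")
print(f"vs cement (world): {us_dc_2030 / cement_world:.0%}")       # -> 25%
print(f"vs aluminium (world): {us_dc_2030 / aluminium_world:.0%}") # -> 75%
```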