This is great, but do they have an actual example of something that would have been passed on to consumers? Or is it just a hypothetical?
In the location I’m familiar with, large infrastructure projects have to pay their own interconnection costs. Utilities vary across the country, so I wouldn’t be surprised if there are differences, but in general I doubt there are many situations where a utility was going to raise consumers’ monthly rates specifically to connect some large commercial infrastructure.
Maybe someone more familiar with these locations can provide more details, but I think this public promise is rather easy to make.
However, there are some examples where increased demand from one sector leads to higher prices for everyone. The PJM electricity market has a capacity market, where generators are compensated for committing to deliver electricity on demand. When demand goes up, prices increase in the capacity market, and those prices get charged to everyone. In the last auction, prices were sky-high, which means higher electricity prices for everyone:
https://www.utilitydive.com/news/pjm-interconnection-capacit...
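To make the pass-through mechanics concrete, here's a toy pay-as-clear auction sketch in Python. The offer numbers are made up, and real PJM auctions involve zones, sloped demand curves, and performance rules, but the core dynamic is the same: the marginal offer sets the price that every cleared megawatt earns, and that cost is spread across all load.

    # Toy single-zone capacity auction, pay-as-clear (made-up numbers).
    # Offers are stacked cheapest-first until demand is met; every
    # cleared MW is paid the price of the marginal (last accepted) offer.
    offers = [  # (capacity_mw, offer_price_per_mw_day)
        (5000, 30.0),
        (4000, 80.0),
        (3000, 150.0),
        (2000, 400.0),
    ]

    def clear_auction(offers, demand_mw):
        """Return (clearing_price, total_cost_per_day) for a demand level."""
        cleared = 0
        for capacity, price in sorted(offers, key=lambda o: o[1]):
            cleared += capacity
            if cleared >= demand_mw:
                return price, demand_mw * price
        raise ValueError("not enough offered capacity")

    # A demand bump (e.g. new data centers) clears into a pricier tier,
    # and the higher price applies to ALL load, not just the new load.
    for demand in (8000, 11000):
        price, cost = clear_auction(offers, demand)
        print(f"demand {demand} MW -> ${price}/MW-day, ${cost:,.0f}/day total")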
A lot of electricity markets in other places allow procurement processes where the increased costs of meeting demand get passed to all consumers equally. If these places were actually using integrated resource plans (IRPs) with up-to-date pricing, adding new capacity from renewables and storage would lower prices. Instead, many utilities go with what they know, gas generators, which are in short supply and coming in at very high prices.
And the cost of the grid is high everywhere. As renewables and storage drive down electricity generation prices, the grid will become a larger and larger share of electricity costs. Interconnection is just one piece of the cost; transmission needs to be upgraded all around as overall demand grows. We've gone through a few decades of stagnant-to-declining electricity demand, and utilities are hungry to do very expensive grid projects because in most parts of the country they get a guaranteed rate of return on grid expansion.
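To illustrate that rate-of-return incentive with assumed numbers (a real revenue requirement also includes depreciation, O&M, and taxes):

    # Cost-of-service sketch: under rate-base regulation a utility earns
    # its allowed return on deployed capital, so every dollar of grid
    # capex grows what it is entitled to collect from ratepayers.
    rate_base = 10_000_000_000      # existing grid assets, $ (assumed)
    allowed_return = 0.095          # allowed rate of return (assumed)
    new_grid_capex = 2_000_000_000  # hypothetical transmission build-out, $

    def return_component(base):
        return base * allowed_return

    extra = return_component(rate_base + new_grid_capex) - return_component(rate_base)
    print(f"extra revenue from ratepayers: ${extra:,.0f}/yr")
    # -> $190,000,000/yr, for as long as the asset stays in the rate base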
They don’t generally just have GW of power sitting idle for a rainy day (I’m not talking about the capacity they reserve for hot July days).
> Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years.
These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
Quite the opposite, really. I did some napkin math for energy and water consumption, and compared to humans these things are very resource efficient.
If LLMs improve productivity by even 5% (studies actually peg productivity gains across various professions at 15-30%, and those are from 2024!), the resource savings from accelerating all knowledge workers are significant.
Simplistically, during 8 hours of work a human consumes 10 kWh of electricity + 27 gallons of water. Sped up by 5%, that drops by 0.5 kWh and 1.35 gallons. Even assuming a higher-end estimate of resources used by LLMs, 100 large prompts (~1 every 5 minutes) would only consume 0.25 kWh + 0.3 gallons. So we're still saving ~0.25 kWh + ~1 gallon overall per day!
That is, humans + LLMs are way more efficient than humans alone. As such, the more knowledge workers adopt LLMs, the more efficiently they can achieve the same work output!
If we assume a conservative 10% productivity speedup, adoption across all ~100M knowledge workers in the US would recoup the resource cost of a full training run in a few business days, even after accounting for the inference costs!
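Here's the same napkin math as a Python sketch, so the assumptions are explicit. The per-worker figures are the ones above; the ~100 GWh for a frontier training run is my own rough, order-of-magnitude assumption, not a measured figure.

    # Assumptions from the napkin math above:
    HUMAN_KWH_PER_DAY = 10.0   # electricity per 8h knowledge-work day
    HUMAN_GAL_PER_DAY = 27.0   # water per 8h day
    LLM_KWH_PER_DAY = 0.25     # ~100 large prompts, high-end estimate
    LLM_GAL_PER_DAY = 0.30

    def net_savings(speedup):
        """Per-worker daily (kWh, gallons) saved at a given speedup."""
        kwh = HUMAN_KWH_PER_DAY * speedup - LLM_KWH_PER_DAY
        gal = HUMAN_GAL_PER_DAY * speedup - LLM_GAL_PER_DAY
        return kwh, gal

    kwh, gal = net_savings(0.05)
    print(f"5% speedup: {kwh:.2f} kWh, {gal:.2f} gal saved per worker-day")
    # -> 0.25 kWh and 1.05 gal, matching the figures above

    # Fleet-wide recoup time at a conservative 10% speedup:
    workers = 100_000_000     # ~100M US knowledge workers
    training_run_kwh = 100e6  # assumed ~100 GWh per frontier training run
    kwh10, _ = net_savings(0.10)
    print(f"days to recoup one training run: {training_run_kwh / (workers * kwh10):.1f}")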
Additional reading with more useful numbers (independent of my napkin math):
https://www.nature.com/articles/s41598-024-76682-6
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans...
Saying “we can do the same work with less resource use” doesn’t mean resource consumption actually goes down. You’ve just gone from humans using resources to humans using the same resources while doing less of the work, plus AI using additional resources on top.
Buying electricity isn't inherently destructive. That's a very bad analogy.
> These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
I'm not arguing that they are efficient right now, but how would you measure that? What kind of output does it have to produce per kWh of input to be acceptable? Keep in mind that the baseline of US power use is around 500 GW, and AI is currently maybe 10 GW.
> AI sector will need at least 50 gigawatts of capacity over the next several years.
The error bars on this prediction are extremely large. 50 GW would represent roughly a 5% increase in capacity over "the next several years," which is only a percent or two per year; but it could also turn out to be only 5 GW over that period. And 50 GW is about one year of actual grid additions.
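Back-of-envelope on those ratios, assuming ~1,100 GW of US nameplate capacity and ~50 GW/year of recent additions (both are my approximations):

    us_capacity_gw = 1100     # approx. US utility-scale nameplate capacity
    us_avg_load_gw = 500      # approx. average load, per the comment above
    annual_additions_gw = 50  # roughly one recent year of new capacity

    for ai_gw in (5, 50):
        print(f"{ai_gw} GW of AI = {ai_gw / us_capacity_gw:.1%} of capacity, "
              f"{ai_gw / us_avg_load_gw:.1%} of average load, "
              f"{ai_gw / annual_additions_gw:.1f} years of typical additions")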
> All of you building these things for these people should be embarrassed and ashamed.
I'm not building these things, and I think there should be AI critique, but this is far over the top. There's great value for all of humanity in these tools. The actual energy use of a typical user is not much more than that of a typical home appliance, because so many requests are batched together and processed in parallel.
We should be ashamed of getting into our cars every day; that's a true harm to the environment. We should have built something better and allowed for more transit. A daily commute of 30 miles is disastrous for the environment compared to any AI use that's realistically possible at the moment.
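To put rough numbers on that comparison (the 25 mpg and the 3 Wh/query figures are my assumptions; published per-query estimates vary widely):

    KWH_PER_GALLON_GAS = 33.7  # energy content of a gallon of gasoline
    MPG = 25                   # assumed average car efficiency
    commute_kwh = 30 / MPG * KWH_PER_GALLON_GAS  # 30-mile daily commute

    WH_PER_QUERY = 3.0         # generous per-request estimate (assumed)
    heavy_llm_kwh = 100 * WH_PER_QUERY / 1000    # 100 queries/day

    print(f"30-mile commute: ~{commute_kwh:.0f} kWh of primary energy")
    print(f"100 LLM queries: ~{heavy_llm_kwh:.1f} kWh")
    print(f"ratio: ~{commute_kwh / heavy_llm_kwh:.0f}x")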
Let's be cautious of AI but keep our critiques grounded in reality, so that we have enough powder left to fight the rest of the things we need to change in society.
See, the AI is gonna create jobs, not eliminate them lol. Now let us strip mine your hood G.
How does paying more monthly cover an infrastructure build-out that requires up-front capital?