> The artificial intelligence boom is driving a surge in electricity demand across the United States, as data centers powering AI tools like large language models (LLMs) and image generators **consume massive amounts of energy.** That demand is pushing up household power bills and **straining the country’s electric grid**, leaving millions of Americans footing the bill.
> This summer, electricity bills surged across the eastern United States. In Trenton, New Jersey, the average home’s monthly bill rose by $26. In Columbus, Ohio, it climbed $27, **driven largely by rising costs in the region’s wholesale power markets.**
> The latest data from the U.S. Energy Information Administration backs that perception. In May 2025, the average U.S. household paid 17.47 cents per kilowatt-hour, up from 16.41 cents a year earlier—a 6.5 percent increase. Some states saw much sharper spikes, including Maine (**up 36.3 percent**) and Connecticut (**18.4 percent**).
> “These are not your average server farms,” said Abraham Silverman, an energy researcher at Johns Hopkins University. “AI training centers—hyperscale facilities—can draw hundreds or thousands of megawatts at a single site. **It’s like building five nuclear plants into the grid every year, just for AI.”**
> Even as utilities invest in infrastructure, many large-scale data center projects don’t cover the full cost of the substations and transmission lines needed to serve them. A report by the analytics firm Wood Mackenzie found that in most cases, **utilities end up shifting these costs onto other customers** or absorbing them entirely. “**Utilities either need to socialize the cost to other ratepayers or absorb that cost**—essentially, their shareholders would take the hit,” said Ben Hertz-Shargel.
pdxaroo:
I’m just going to point out this isn’t an AI problem, it’s officials not properly regulating and pricing industry.
These data centers need to pay more, not less.
sevseg_decoder:
Meanwhile people have pages-long, hours-long conversations with the AI and get pissed it lost some of its “personality.”
I don’t think there’s a scenario where AI is a whole ton cheaper than the employees it “replaces,” especially not factoring in the technical debts that will come due in the long run. Meanwhile we’re just speeding up our pollution of the environment and burning through finite water supplies so people can have the AI write a “script for their life as a TV show” to share with their friends lol.
nullv:
More coal will surely fix the problem. Perhaps we could also use steam engines to power the computers.
Beneficial_Soup3699:
Corporate socialism chugging along just fine in America, as always. It’s a shame these poor billionaire AI company owners can’t stop building their doomsday bunkers long enough to pull themselves up by their bootstraps 🙁
ElMerroMerr0:
Seems like it’s biannually now that I get a notice from my power company letting me know they’re requesting a rate increase.