xAI probably running most of that off temporary, unregulated diesel generators…
Mazzle5 on
Good luck with that run-down electrical system in the US
thrBeachBoy on
I work in hydroelectricity
That’s all good for my future
But honestly, seeing people ask AI for so much stuff a regular search could handle makes me worry about the capacity to provide all this power.
nnrain on
Mfw the entire country of Sweden’s peak electrical capacity is like half of what these AI companies want to run their AI waifus on.
m4rk0358 on
At least we won’t have to wait long for the Earth to become uninhabitable. But hey, people will be able to make funny images for a few years.
TophatOwl_ on
Consider that this is to be funded in part with $1.5 trillion pledged by OpenAI, but they don’t even crack $0.1 trillion in revenue. In fact, they are closer to $0.03 trillion.
bzzard on
You want some power for your fridge? That’s so selfish! That power could go to powering AI!
tegresaomos on
So like 12 nuclear power plants?
And who is going to build this?
Halfwise2 on
Context – The United States currently has an average output of ~**466,210 MW**.
They are therefore looking for an increase in output of ~7%, functionally the equivalent of adding 23,000,000 people to our population in terms of energy demand.
That could mean roughly 80-175 new power plants, depending on size and fuel type (there are 12,538 power plants currently).
Context matters, and it’s easy to deceive even with hard numerical values, so I tried to give a mix of “that sounds like a lot” and “that doesn’t sound like much” perspectives.
Edit: Clarity
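For anyone who wants to check that arithmetic, here is a quick back-of-envelope sketch in Python. Only the 466,210 MW figure comes from the comment above; the ~33 GW of planned AI capacity (implied by the ~7% figure) and the ~334 million US population are rough assumptions of mine.

```python
# Back-of-envelope check of the figures in the comment above.
us_avg_output_mw = 466_210          # average US generation in MW (from the comment)
us_population = 334_000_000         # assumption: rough current US population
planned_ai_capacity_mw = 33_000     # assumption: implied by the ~7% figure

share_of_us_output = planned_ai_capacity_mw / us_avg_output_mw
per_capita_kw = us_avg_output_mw * 1_000 / us_population             # ~1.4 kW per person
population_equivalent = planned_ai_capacity_mw * 1_000 / per_capita_kw

print(f"Share of current US output: {share_of_us_output:.1%}")                          # ~7%
print(f"Average demand per person:  {per_capita_kw:.2f} kW")
print(f"Population equivalent:      {population_equivalent / 1e6:.1f} million people")  # ~23M
```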
kelvindevogel on
Can somebody explain why we’re measuring datacenter capacity in MW instead of FLOPs or another unit of compute power?
optionr_ENL on
>The data was primarily collected from machine learning papers, publicly available news articles, press releases, and existing lists of supercomputers.
>We created a list of potential supercomputers by using the Google Search API to search key terms like “AI supercomputer” and “GPU cluster” from 2019 to 2025, then used GPT-4o to extract any supercomputers mentioned in the resulting articles. We also added supercomputers from publicly available lists such as Top500 and MLPerf, and GPU rental marketplaces. For each potential cluster, we manually searched for public information such as number and type of chips used, when it was first operational, reported performance, owner, and location.
Love that they consider this to be ‘data’
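For anyone curious what the quoted methodology would look like in practice, here is a minimal, hypothetical sketch of that kind of collection pipeline, assuming the Google Custom Search JSON API and the OpenAI Python client. The endpoint parameters, environment variable names, and prompt wording are illustrative assumptions, not the authors’ actual code.

```python
# Illustrative sketch of a "search, then extract with GPT-4o" pipeline.
import os
import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def search_articles(query: str, api_key: str, cx: str) -> list[dict]:
    """Fetch search results for one query via the Google Custom Search JSON API."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": api_key, "cx": cx, "q": query},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])


def extract_clusters(article_text: str) -> str:
    """Ask GPT-4o to list any AI supercomputers or GPU clusters mentioned in the text."""
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": (
                "List any AI supercomputers or GPU clusters mentioned in the "
                "following article, with owner and chip count if stated:\n\n"
                + article_text
            ),
        }],
    )
    return completion.choices[0].message.content


if __name__ == "__main__":
    for term in ("AI supercomputer", "GPU cluster"):
        for item in search_articles(term, os.environ["GOOGLE_API_KEY"], os.environ["GOOGLE_CX"]):
            print(item["title"], "->", extract_clusters(item.get("snippet", "")))
```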
bdkoskbeudbehd on
Somebody please pop this AI bubble already
Gandalfthebran on
Why does META need so much more power?
overzealous_dentist on
Can we stop calling things oligarchies when they are obviously not remotely related to oligarchies?
Shliopanec on
This is genuinely going to be our demise; there’s no way all of that electricity demand will be met with green energy.
Tar_alcaran on
This “data” is utter bullshit. Microsoft alone reportedly had over 600,000 H100 (equivalent) GPUs last year, whereas xAI barely hits half that today.
LanchestersLaw on
Is Google, one of the forerunners, really planning to sit in 10th place, or do their actions suggest that a plan for 1 GW of GPUs is just stupid marketing fluff?
everlasting1der on
Deeply unserious industry
savage011 on
Some of these companies plan on building their own nuclear power plants.
Nuclear Stonks?
Sea-Sir2754 on
If only they all hadn’t directly helped reelect the guy who immediately cancelled all those renewable energy infrastructure upgrade projects.
UrbanPlannerholic on
And Trump wants to power it all with coal
Glockass on
Mad that artificial intelligence needs several nuclear reactors’ worth of energy, while normal intelligence can run off a Snickers and some cocaine.
Strong-Chair3017 on
As someone sitting in central Ohio, working at a hyperscaler… we have single buildings consuming more than what’s listed in the existing capacity section here.
This is woefully incorrect information.
(I will concede that some of it may just not be public)
Independent-Bug-9352 on
I, uh, hope we get fusion soon.
Answerisequal42 on
This is the best argument for NPPs I have seen in a while.
The planned power requirements are batshit insane.
Sporkers on
This is wildly inaccurate as far as operational numbers go. Google, for example, already has a lot more than 80 MW of TPUs doing AI training and inference.
asyhler on
But don’t forget to turn off the lights when leaving the room!
needefsfolder on
Wow, Google only uses ~80 MW and can still dispatch Gemini + Claude workloads without being overloaded (relative to OpenAI?). That’s nice. Or that’s… underreporting.