28 Comments

  1. I work in hydroelectricity

    That’s all good for my future

    But honestly, seeing people ask AI for so much stuff a regular search can do makes me worry about our capacity to provide all that power.

  2. Mfw the entire country of Sweden’s peak electrical capacity is like half of what these AI companies want to run their AI waifus on.

  3. At least we won’t have to wait long for the Earth to become uninhabitable. On the bright side, people will be able to make funny images for a few years.

  4. Consider that this is to be funded in part with $1.5 trillion pledged by OpenAI, but they don’t even crack $0.1 trillion in revenue. In fact, they are closer to $0.03 trillion.
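
    That gap can be sanity-checked with a quick sketch, using only the figures the commenter gives (not audited financials):

    ```python
    # Rough ratio of pledged spending to revenue, per the comment's figures.
    pledged_usd = 1.5e12    # ~$1.5 trillion reportedly pledged by OpenAI
    revenue_usd = 0.03e12   # ~$0.03 trillion (~$30B) in annual revenue

    ratio = pledged_usd / revenue_usd
    print(f"Pledged spending is roughly {ratio:.0f}x annual revenue")  # ~50x
    ```

    So the pledged spending is on the order of fifty times current revenue, which is the scale mismatch the comment is pointing at.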

  5. Context – The United States currently has an average output of ~**466,210 MW**.

    Therefore, they are looking for an increased output of ~7%, or functionally the equivalent of adding 23,000,000 people to our population in terms of energy demand.

    This could mean about 80–175 new power plants, depending on size and fuel type. (There are 12,538 power plants currently.)

    Context matters, and it’s easy to deceive even with hard numerical values, so I tried to mix the perspectives of “that sounds like a lot” and “that doesn’t sound like much.”

    Edit: Clarity
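
    The arithmetic above can be double-checked with a short sketch. The output figure and the ~7% increase are the commenter’s numbers; the ~334 million US population is an added assumption:

    ```python
    # Check of the comment's arithmetic. The output and increase figures
    # are the commenter's; the population figure is an assumption.
    us_avg_output_mw = 466_210      # average US output, in MW (per comment)
    increase_fraction = 0.07        # claimed ~7% increase
    us_population = 334_000_000     # assumed rough US population

    added_mw = us_avg_output_mw * increase_fraction
    per_capita_kw = us_avg_output_mw * 1_000 / us_population  # avg kW/person
    people_equivalent = added_mw * 1_000 / per_capita_kw

    print(f"Added output: ~{added_mw:,.0f} MW")               # ~32,635 MW
    print(f"Average load per person: ~{per_capita_kw:.2f} kW")
    print(f"Population equivalent: ~{people_equivalent:,.0f} people")
    ```

    Under that population assumption, the population equivalent comes out near 23.4 million people, so the comment’s ~23,000,000 figure is internally consistent with its other numbers.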

  6. Can somebody explain why we’re measuring datacenter capacity in MW instead of FLOPs or another unit of compute power?

  7. >The data was primarily collected from machine learning papers, publicly available news articles, press releases, and existing lists of supercomputers.

    >We created a list of potential supercomputers by using the Google Search API to search key terms like “AI supercomputer” and “GPU cluster” from 2019 to 2025, then used GPT-4o to extract any supercomputers mentioned in the resulting articles. We also added supercomputers from publicly available lists such as Top500 and MLPerf, and GPU rental marketplaces. For each potential cluster, we manually searched for public information such as number and type of chips used, when it was first operational, reported performance, owner, and location.

    Love that they consider this to be ‘data’

  8. overzealous_dentist on

    Can we stop calling things oligarchies when they are obviously not remotely related to oligarchies?

  9. This is genuinely going to be our demise; there’s no way all of that electricity demand will be met with green energy.

  10. This “data” is utter bullshit. Microsoft alone reportedly had over 600,000 H100 (equivalent) GPUs last year, whereas xAI barely hits half that today.

  11. LanchestersLaw on

    Is Google, one of the forerunners, really planning to be 10th, or do their actions suggest that a plan for 1 GW of GPUs is stupid marketing fluff?

  12. Some of these companies plan on building their own nuclear power plants.  
      
    Nuclear Stonks? 

  13. If only they all hadn’t directly helped reelect the guy who immediately cancelled all those renewable energy infrastructure update projects.

  14. Mad that artificial intelligence needs several nuclear reactors’ worth of energy, while normal intelligence can run off a Snickers and some cocaine.

  15. Strong-Chair3017 on

    As someone sitting in central Ohio, working at a hyperscaler… we have single buildings consuming more than what’s listed in the existing capacity section here.
    This is woefully incorrect information.

    (I will concede that some of it may just not be public)

  16. Answerisequal42 on

    This is the best argument for NPPs I have seen in a while.

    The planned power requirements are batshit insane.

  17. This is wildly inaccurate as far as operational numbers go. Google, for example, already has well more than 80 MW of TPUs doing AI training and inference.

  18. Wow, Google only uses ~80 MW and can still dispatch Gemini + Claude workloads without being overloaded (relative to OpenAI?). That’s nice. Or that’s… underreporting.