
13 Comments

  1. “For decades, Moore’s Law, coined by Intel co-founder Gordon Moore in 1965, has been the driving force behind computing progress. It predicted that the number of transistors on computer chips would roughly double every year, leading to exponential growth in performance and plummeting costs. However, this law has shown signs of slowing down in recent years.

    Huang, however, painted a different picture of Nvidia’s AI chips. “Our systems are progressing way faster than Moore’s Law,” he told TechCrunch, pointing to the company’s latest data center superchip, which is claimed to be more than 30 times faster for AI inference workloads than its predecessor.

    Huang claimed that Nvidia’s AI chips today are 1,000 times more advanced than what the company produced a decade ago, far outstripping the pace set by Moore’s Law.

    Rejecting the notion that AI progress is stalling, Huang outlined three active AI scaling laws: pre-training, post-training, and test-time compute. He pointed to the importance of test-time compute, which occurs during the inference phase and allows AI models more time to “think” after each question.”
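
    The gap between the quoted figures is easy to check with a minimal sketch. This assumes the common "doubling every two years" reading of Moore's Law (the quote above notes his original 1965 prediction was roughly every year); the function name is ours, purely for illustration.

    ```python
    # Minimal sketch: Moore's Law growth over a decade vs. Huang's claimed 1,000x.
    # Assumes the "doubling every two years" reading of Moore's Law.

    def moores_law_factor(years, doubling_period=2):
        """Growth multiple after `years` if capability doubles every `doubling_period` years."""
        return 2 ** (years / doubling_period)

    decade = moores_law_factor(10)  # 2**5 = 32x over ten years
    print(f"Moore's Law pace over 10 years: ~{decade:.0f}x")
    print(f"Claimed 1,000x is ~{1000 / decade:.0f}x ahead of that pace")
    ```

    Under those assumptions, a decade of Moore's Law yields about 32x, so the 1,000x claim is roughly 31 times faster than that pace.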

  2. Considering “Moore’s Law” had to do with transistor count… he is sort of mixing apples and oranges. Just like a showman.

  3. AI doesn’t correlate with Moore’s Law. In the future he will claim that an AI chip without any “raw” power will be Moore’s Law’s grandpapi.

  4. The CES presentation was a bunch of fluff, misleading charts and AI hype. I’m taking anything from this dude, Sam Altman, or anybody whose best interests lie in generating false hype with a large chunk of salt.

  5. Moore’s Law implies absolutely nothing about computer performance.

    Imagine someone says “computer performance doubles every 2 years”. What applications/workloads are we talking about? It absolutely depends on what you’re benchmarking.

  6. I’m firmly in the “frame-gen = fake frames” camp. I don’t care how much AI they use in the render process, response times of an AI-enhanced output will never equal response times of a raw, non-enhanced output. Gamers have spent years chasing high frame-rates and low response times. They aren’t going to be happy with 60fps response times just because they’re seeing 240 fps on-screen.
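
    The commenter's numbers can be made concrete with a small sketch. The assumption here (ours, not the commenter's) is that generated frames raise the displayed rate but input is only sampled at the base render rate, so input-to-response delay tracks the base frame time.

    ```python
    # Sketch of the frame-gen latency argument: a 240 fps display rate built on a
    # 60 fps base render rate still responds to input at 60 fps frame times.

    def frame_time_ms(fps):
        """Duration of one frame in milliseconds at the given frame rate."""
        return 1000.0 / fps

    base_fps, displayed_fps = 60, 240  # figures from the comment above
    print(f"Displayed frame time: {frame_time_ms(displayed_fps):.1f} ms")   # ~4.2 ms
    print(f"Input-limited frame time: {frame_time_ms(base_fps):.1f} ms")    # ~16.7 ms
    ```

    On those numbers, the screen updates every ~4.2 ms but the game only reacts to input every ~16.7 ms, which is the gap the commenter is objecting to.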

  7. The Nvidia GPU I’ll buy twenty years from now is already generating frames on the games I play today

  8. farticustheelder

    This is fun. There is a sense in which Moore’s Law is just a local portion of Wright’s Law, and maybe Huang’s Law is the next named section coming up. Or not. Nvidia’s chip-level improvement to date may be all the available low-hanging optimization fruit, with diminishing returns to come.

  9. andrew_kirfman

    “Problem domain that we hadn’t optimized for a decade ago is now highly optimized with lots of continuing investment funding”

    More News at 11

  10. Moore’s law has nothing to do with performance and everything to do with transistor density. What a tool.