During the Dot Com bubble, billions were poured into fiber optic infrastructure that was widely considered financial overreach. 90% of it went dark. But that same infrastructure became the physical foundation that YouTube, Netflix and Facebook were built on 20 years later. A full breakdown here: https://youtu.be/_NDAUTyRxqY

Today's AI data center and GPU buildout follows a strikingly similar pattern – massive capital expenditure that current adoption rates struggle to justify, concentrated in a handful of companies running a circular cash flow. The question worth asking isn't whether the bubble deflates, but whether the infrastructure being laid today plays the same long game the fiber did.

The AI infrastructure buildout mirrors the Dot Com fiber optic boom – and history suggests the long term story might matter more than the short term financial one (link in the comments)
by u/OkHeat6599 in r/Futurology

17 Comments

  1. The full data-driven breakdown of the structural parallels between the two cycles – the circular cash flows, the infrastructure legacy, and the most likely outcome – is here: [https://youtu.be/_NDAUTyRxqY](https://youtu.be/_NDAUTyRxqY)

  2. TooMuchTaurine on

    Thing is, fibre has close to infinite life. You can just keep upgrading the endpoints. The GPUs in all these installs will go EOL in 3-4 years.
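    The asset-lifetime gap this comment points at can be made concrete with straight-line depreciation. All numbers below are illustrative assumptions (a ~4-year GPU refresh cycle and a ~25-year fiber life are plausible ballparks, not sourced figures):

    ```python
    # Sketch: how much book value remains under straight-line depreciation,
    # comparing a short-lived asset (GPU) to a long-lived one (fiber).
    # Lifetimes and capex are illustrative assumptions only.

    def remaining_value(capex: float, useful_life_years: float, age_years: float) -> float:
        """Value left after age_years under straight-line depreciation."""
        if age_years >= useful_life_years:
            return 0.0
        return capex * (1 - age_years / useful_life_years)

    capex = 100.0  # arbitrary units

    # Assumed lifetimes: ~4 years per GPU generation, ~25 years for fiber.
    for label, life in [("GPU", 4), ("fiber", 25)]:
        for age in (4, 10, 20):
            print(f"{label}: after {age}y -> {remaining_value(capex, life, age):.1f}")
    ```

    On these assumptions the GPU side of the buildout is fully written off before the fiber side loses even a quarter of its value, which is the core of the "raw material vs. infrastructure" distinction raised later in the thread.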

  3. The bigger issue I foresee is that fiber optic cable was more future-proof than the hardware that composes server infrastructure. The data drives, RAM, GPUs, and CPUs in use today will become outdated far faster than fiber on a 20-year timeline. AI data centers will need to upgrade hardware constantly to keep up with the technology and avoid becoming the bottleneck.

  4. Thin-Theory-4805 on

    I will wait for the short-term deep correction and see which players survive. Investing at this time is frankly stupid.

  5. projectschema on

    Very interesting topic. I would say part of the infrastructure built today will be repurposed in the future. AI will likely change over time, generally toward more efficient infrastructure, so the remainder will power other digital industries as well.

  6. It does not. GPUs are essentially raw materials consumed in the production process. Fiber optic cables are true infrastructure that lasts long after installation.

  7. Flutterpiewow on

    Same thing with electricity, printing press and railroads, it’s just how it works.

  8. I like the premise, though I struggle to think of the equivalent. People mention that the actual hardware reaches EOL stupid fast.
    My guess is that soon-ish the search for AGI will halt.

    Fiction interlude:
    We Are Legion (We Are Bob) is a great piece of fiction that touches on that idea: AI is far more difficult to make than the science fiction of the previous century made us believe. The book names AMIs (Automated Machine Intelligence), smart enough to do programmable tasks but ultimately unable to do anything that wasn’t specifically accounted for by their creators. Replicant AI in the books refers to the same concept as the Matrix or Upload: digital human consciousness. Though in the case of the book these AIs can tweak their perception of time and access to digital features to behave more like a human operating a computer at computer speeds. Humanity never discovers the bridge towards actual digital consciousness in the books.

    I’m pretty convinced that this will be true in real life as well, at least for this half of the century. All the work and research going into it will ultimately lead to very useful machines, but with no major leap towards AGI or ASI. At some point this will sink in and the focus will shift, just like with personal computers and the Internet, to broad usability and affordability: training AI models as a service but selling them as licences to run locally on consumer devices. Not that this last step is certain, but to bring it back to the original question, my guess is that the infrastructure equivalent is the methods being developed for training better models, and the shift we might see in the future will revolve around scaling down to usefulness at a human/consumer scale.

  9. Fiber was laid and most of it went unused until years later, hence the term "dark fiber." GPUs from 6 years ago are still 100% utilized. The two situations are not mirrored. Not even close.

  10. Infinite-Jelly-3182 on

    This is the sort of thing that looks right on the surface but falls apart under deep analysis of either cycle.

  11. theallsearchingeye on

    The entire premise of your point is that the infrastructure isn’t wasted, so I don’t think there’s a question that the billions spent on data centers, software, and consumer appeal will “play the same long game”; the infrastructure will get used and continue to expand.

    Where this scenario differs wildly is the scale of growth and development: the AI models are growing in scope and power at an exponential rate, and we can’t even keep up with the inherent infrastructure need. The data centers we are building today won’t even be able to contain the models of two years from now; that’s why the hyperscaling is so necessary.

    Another thing: the dot-com bubble was signaled by a lack of utilization of the infrastructure investments, not purely a lack of ROI (which followed from that lack of utilization). Comparisons with the dot-com bubble are juvenile because the only criterion is "all this explosive growth must actually be useless, right?", when all the signs from the technology couldn’t be farther from the truth.

  12. Ah, so we’re past the point of denial and into the bargaining stage. That’s… progress I guess.

  13. ultrathink-art on

    Fiber was architecture-neutral — Netflix, YouTube, TikTok all ran on the same cables without modification. Today’s GPU clusters are optimized specifically for transformer-style dense matrix ops, so if the dominant architecture shifts, the hardware won’t ‘go dark temporarily’ the way fiber did — it may be genuinely obsolete.

  14. AttitudeGlass64 on

    the fiber analogy is interesting but the key difference is where the stranded asset sits. in the dot-com build-out, the fiber stayed useful even after the companies that laid it went bankrupt — the infrastructure outlasted the speculation. with AI compute the question is whether GPUs have the same staying power, or whether they become obsolete faster than the fiber did. if the next generation of models requires substantially different hardware, the current buildout is less like fiber and more like purpose-built equipment that gets stranded. the bull case is general-purpose GPU clusters retain value across model generations. the bear case is custom silicon dominates and current infrastructure loses its relevance faster than expected.

  15. CheifJokeExplainer on

    Except, the fiber that was laid didn’t become obsolete. We still use that kind of fiber, perhaps with improved transceivers, but the same physical medium. I’m not sure that will be true of computers in 20 years. I guess the buildings will still be much the same.