The last time capital flowed at this velocity into a single infrastructure category was the late-1990s and early-2000s buildout of broadband and fiber. We all know how that ended. But the comparison to AI infrastructure today misses something important: the demand signal is real, and it’s accelerating.
Hyperscalers are committing to multi-year, multi-hundred-billion-dollar capex programs. Sovereign wealth funds are underwriting data center campuses. The U.S. government is treating compute as a strategic national asset. This is no longer a venture story — it’s an infrastructure story.
In the 20th century, access to oil defined geopolitical and economic power. Nations without domestic production were structurally dependent. Nations with it could project influence, fund governments, and shape markets.
Compute is beginning to work the same way. The ability to train and run frontier AI models requires massive investments in power, cooling, land, chips, and interconnect — inputs that are not evenly distributed.
> “The nation that leads in AI will be the nation that controls its own compute stack. This is not a technology question — it’s a sovereignty question.”
Most equity analysts are modeling AI infrastructure as a cyclical capex story: spending goes up, then normalizes, then the companies that built capacity have to justify returns. This framing is wrong.
The better analogy is toll roads. Once the infrastructure exists and customers are on it, switching costs are enormous. The question is not whether returns normalize — it’s how long the moat holds.
| Layer | Incumbent | Moat | Risk |
|---|---|---|---|
| Chips | NVIDIA | CUDA ecosystem, supply chain | AMD, custom silicon |
| Cloud | AWS, Azure, GCP | Scale, software integration | Sovereign clouds, CoreWeave |
| Power | Utilities | Regulated monopoly | Nuclear, distributed gen |
| Data Centers | Equinix, Digital Realty | Location, connectivity | Hyperscaler build-own |
Three things will determine whether this infrastructure cycle ends differently than the fiber buildout:
1. **Power availability.** The single most binding constraint is not chips or land — it’s the electrical grid. Every major hyperscaler is now a de facto energy company. The ones that solve power first will win.
2. **Model efficiency curves.** If inference gets dramatically cheaper through distillation and quantization, the demand for raw compute could plateau. This is the bear case the bulls are ignoring.
3. **Geopolitical fragmentation.** Export controls on advanced chips are already reshaping where AI infrastructure gets built. This bifurcation will accelerate — and it creates asymmetric opportunities for investors who understand both sides of the divide.
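To see why power dominates the other inputs, a back-of-envelope estimate helps. The sketch below is illustrative only: the accelerator count, overhead multiplier, and PUE are assumptions I've chosen for round numbers, and the ~700 W figure is roughly the thermal design power of a current H100-class part.

```python
# Back-of-envelope grid draw for one large AI training campus.
# All parameters are illustrative assumptions, not vendor specs.

ACCELERATORS = 100_000   # assumed accelerator count for one campus
TDP_WATTS = 700          # per-accelerator power, roughly H100-class TDP
OVERHEAD = 1.8           # assumed multiplier for CPUs, networking, storage
PUE = 1.3                # assumed power usage effectiveness (cooling, losses)

it_load_mw = ACCELERATORS * TDP_WATTS * OVERHEAD / 1e6  # IT load in megawatts
facility_mw = it_load_mw * PUE                          # total grid draw

print(f"IT load: {it_load_mw:.0f} MW, facility draw: {facility_mw:.0f} MW")
```

Under these assumptions a single campus draws on the order of 150–200 MW — a meaningful fraction of a utility-scale power plant, which is why siting decisions now start with the grid rather than the real estate.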
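The efficiency bear case is ultimately an arithmetic claim: aggregate compute demand is token demand multiplied by compute per token, so the question is which grows faster. A toy model, with purely hypothetical growth rates chosen only to illustrate the crossover, makes the point:

```python
# Toy model of the efficiency bear case: relative compute demand is
# token-demand growth divided by per-token efficiency improvement.
# The growth rates below are hypothetical, not forecasts.

def compute_demand(years: int, demand_growth: float, efficiency_gain: float) -> float:
    """Relative aggregate compute demand after `years`, given annual
    token-demand growth and annual per-token efficiency gain (as ratios)."""
    return (demand_growth ** years) / (efficiency_gain ** years)

# Bull case: token demand doubles yearly, efficiency improves 40%/yr
print(f"{compute_demand(3, 2.0, 1.4):.2f}x")  # demand still compounds

# Bear case: token demand grows 50%/yr, but efficiency doubles yearly
print(f"{compute_demand(3, 1.5, 2.0):.2f}x")  # aggregate demand shrinks
```

The asymmetry is the whole argument: capacity built today is underwritten against the bull-case ratio, and a few years of distillation and quantization gains running ahead of token growth is enough to flip it.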
The infrastructure supercycle is real. The question is who captures the value — and at what point in the stack. I’ll be writing more about each of these layers over the coming months.