“Energy is the only universal currency; one of its many forms must be transformed to get anything done.” –Vaclav Smil.
For years, we thought AI was software—an abstraction floating above the physical world. Write the code, design better models, fabricate more advanced processors, and watch the transformation unfold. The capital was there. The talent was there. The so-called Magnificent Seven are spending on an epic scale to build it. Genius and money, however, are not enough. The constraint is power.
The scale of that constraint is no longer theoretical. Reuters recently reported that “some of the largest sites require over a gigawatt of continuous power, equivalent to the consumption of hundreds of thousands of homes”. A single AI campus can now draw as much electricity as a mid-sized city. In grid-planning terms, that is the output of a full-scale power plant serving a single customer.
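As a rough sanity check on that equivalence (a back-of-envelope sketch; the average household consumption figure below is an assumed round number, not from the Reuters reporting):

```python
# Back-of-envelope check: how many average U.S. homes does 1 GW of
# continuous draw correspond to? The annual household consumption
# figure is an assumed round number used only for illustration.
ANNUAL_KWH_PER_HOME = 10_800        # assumed average U.S. household usage per year
HOURS_PER_YEAR = 8_760

avg_kw_per_home = ANNUAL_KWH_PER_HOME / HOURS_PER_YEAR   # ~1.2 kW continuous
campus_kw = 1_000_000                                     # 1 GW expressed in kW

homes_equivalent = campus_kw / avg_kw_per_home
print(f"~{homes_equivalent:,.0f} homes")                  # roughly 800,000 homes
```

The exact figure moves with the household average you assume, but the order of magnitude lands squarely in the “hundreds of thousands of homes” range.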
Historically, utilities treated data centers as incremental load additions, folded into long-range forecasts. That approach no longer holds. In its 2026 Load Forecast Report, PJM projects “net energy load growth averaging 5.3% per year over the next 10 years”. For a mature electricity market covering 13 states and Washington, D.C., sustained growth at that rate represents structural acceleration rather than cyclical volatility.
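To put that rate in perspective, a simple compounding calculation (illustrative arithmetic only; the growth rate and horizon come from the PJM forecast):

```python
# Compound PJM's projected 5.3% annual net energy load growth over the
# ten-year forecast horizon to see the cumulative effect.
annual_growth = 0.053
years = 10

multiplier = (1 + annual_growth) ** years
print(f"{(multiplier - 1) * 100:.0f}% cumulative load growth")  # roughly 68%
```

Roughly two-thirds more load within a decade is why the forecast reads as structural acceleration rather than a cyclical swing.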
Grid operators are explicit about the pressure. Reuters notes that “PJM Interconnection and ERCOT, two major U.S. grid operators, warn of imminent shortfalls and excessive connection requests”. The interconnection queue has become a strategic chokepoint. Data centers can be financed in months; transmission upgrades and substation expansions take years.
The macro data reinforces the trajectory. The International Energy Agency projects that global electricity demand from data centers could reach approximately 945 terawatt-hours by 2030, more than twice today’s levels. Pew Research reports that data centers accounted for about 4% of total U.S. electricity consumption in 2024 and that demand is expected to more than double by 2030. These are system-level numbers that reshape infrastructure planning horizons.
The politics have caught up. President Donald Trump is allowing major technology companies to build their own power plants to support their data centers.
The industry’s response has shifted accordingly. Reuters reports that “over 46 data centers plan to build gas-powered plants, with self-sufficiency potentially becoming a requirement”. What once would have been viewed as vertical integration overreach is now framed as prudent risk management. Self-sufficiency, however, is easier said than done.
The technology sector operates on compressed software cycles — rapid releases, iterative deployment, and capital flowing quickly into new capability. Energy infrastructure follows a different tempo. Transmission projects involve multi-year studies and permitting. Gas turbines have extended manufacturing lead times. Substation construction requires coordination across jurisdictions.
Even self-generation is constrained by industrial capacity. Utility Dive reports that GE Vernova’s gas turbine backlog extends years into the future, with expectations that orders could be effectively sold out through 2030. E&E News reports that Siemens Energy is investing $1 billion to expand U.S. turbine and grid manufacturing facilities in response to surging demand. These are long-cycle investments responding to long-cycle bottlenecks.
For a decade, hyperscalers built the cloud on the assumption that infrastructure would scale elastically. Compute and storage were virtualized abstractions, invisible to end users. Today, physical infrastructure has reasserted itself. Megawatts per model iteration is emerging as a meaningful planning metric.
This introduces a new layer of competitive differentiation. Where you build now matters as much as what you build. Regions with surplus generation, shorter interconnection queues, or regulatory clarity become strategic assets. Site selection becomes grid arbitrage.
The financial dimension is evolving as well. Barron’s notes that technology companies are increasingly funding dedicated infrastructure and long-term agreements that make certain utilities major beneficiaries of AI expansion. The contract structures begin to resemble power purchase agreements as much as software service agreements. Cloud companies are no longer just buying power; they are co-developing the infrastructure that delivers it.
AI was marketed as a path to efficiency—smarter logistics, optimized manufacturing, automated workflows. Yet the act of training and serving these systems consumes substantial physical resources. Application-layer efficiency coexists with infrastructure-layer demand growth.
The grid responds to load, not to narrative. An AI company can raise money on a great story; Sam Altman has proved that, although even he may be hitting his limits.
If residential electricity rates rise in regions with high concentrations of data centers, regulators will intervene. Cost allocation proceedings are intensifying. The political system is unlikely to allow hyperscale computing to be indirectly subsidized by household ratepayers. In fact, many municipalities are already rebelling against large data center projects. Expect this to be a major campaign issue in 2026, especially in rural areas that are the prime targets for data centers.
This tension creates incentives for hybrid strategies: on-site generation, long-duration storage, potential small modular nuclear deployment over the longer term, renewable portfolios paired with firming capacity, and demand-response participation in exchange for priority interconnection.
In that environment, success may not favor the most advanced model laboratory. It may favor organizations that manage infrastructure risk with discipline.
The constraint is not capital. It is not software talent. It is not venture appetite.
It is available megawatts.