It was the year that Silicon Valley’s “move fast and break things” approach collided with the U.S. grid: immense, bureaucratic, and largely built nearly a century ago.
Hyperscalers have rushed to build data centers to power artificial intelligence, spiking load growth forecasts to record-high levels. But they’re running up against a patchwork of outdated grid rules that weren’t designed to accommodate so many massive power users, or the new transmission and generation that load requires, on the fast timelines companies are demanding.
The scenario has led some energy analysts to predict capacity shortfalls by 2028 in regions like PJM and ERCOT. Faced with that looming crunch, these large power users have begun prioritizing novel speed-to-power solutions, redefining how the energy transition unfolds as a result.
The signs were everywhere: debates at the Federal Energy Regulatory Commission over whether data centers can be flexible grid assets. Hyperscalers building their own generation. A raft of new companies like Fermi and Creekstone buying up land out West and promising to supply hyperscalers with power fast, initially from gas generators. Big bets on clean baseload technologies both imminent and further off, from a major fundraise by geothermal developer Fervo Energy to Trump Media’s unusual merger with the fusion player TAE Technologies.
These developments reflect how the energy transition, once largely driven by climate policy, is now defined by the scramble for firm, fast power. And the great electron race shows no signs of slowing down — even as chatter of an AI bubble grows.
The largest tech players combined were on track to spend roughly $400 billion in 2025, with the majority dedicated to AI infrastructure including chips, grid upgrades, and bespoke energy assets. Goldman Sachs analysts estimate hyperscalers’ capex may be more than $500 billion this year.
Tech giants, meet FERC
The role of FERC transformed in 2025 thanks to the speed-to-power push. An agency historically focused on interregional transmission and gas pipelines is now taking steps to speed up the connection of AI data centers, partly due to a directive from the Trump administration.
In October, Energy Secretary Chris Wright asked FERC to expedite the connection of large loads that agree to be curtailable, such as data centers, to the transmission system.
The request was part of a broader proposal to expand federal regulatory authority over interconnection, which has traditionally been left to states. The queue has become backlogged with requests from AI data centers and manufacturers, which Wright said requires “prompt attention.” Otherwise, the delay will hamper “a new era of American prosperity,” the order read.
The directive has led to more questions than answers, including how load flexibility would work in practice. Other major flashpoints among hyperscalers, utilities, grid operators, and consumer advocates who responded to the docket include who should get priority access to the grid and how to pay for it. The debate will continue to play out in 2026. Wright directed FERC to propose a rule by April, which would be lightning speed for the agency.
FERC took another step aimed at speeding up AI data center development just two weeks ago. On December 18, the agency ordered PJM to overhaul its rules to allow data centers to colocate at power plants — potentially without waiting for a build-out of new transmission lines. It was a win for companies including Constellation Energy and NextEra, which plan to restart mothballed nuclear reactors solely to power AI data centers by Microsoft and Google, respectively.
Bring-your-own
Some companies are finding ways to skirt the gridlock by bringing their own solutions, securing extra capacity from distributed energy resources, or even going off-grid entirely by building behind-the-meter power plants.
xAI — Elon Musk’s AI company that trains and operates Grok — spent 2025 bringing massive compute and power online in record time at its Colossus facility in South Memphis, Tennessee. The campus did so with a hybrid power strategy: a direct grid connection paired with massive on-site generation from gas turbines. That approach comes with a steep environmental cost for local residents forced to breathe the resulting air pollution; Colossus installed at least 20 additional gas turbines this year, with plans to add still more by early 2026.
A similar approach was deployed at the first site under Project Stargate, the network of AI data centers being built by OpenAI, Oracle, and SoftBank. The campus in Abilene, Texas, which came online this year, has a grid connection and a newly constructed gas plant. But a second location in Shackelford County, Texas, is expected to be completely off-grid, with up to 1.4 GW of gas generators.
For companies going behind the meter, gas generators and fuel cells are often the technology of choice, mainly because they’re available now and can serve as a bridge until the grid is less congested. Data centers are also embracing batteries, both for on-site power and to secure faster interconnection.
Out West, several projects plan to use gas turbines in the short term, then explore a mix of energy sources like solar, battery storage, and small modular reactors to build upwards of 10 gigawatts of capacity later on.
Fermi Inc., the startup backed by former Energy Secretary Rick Perry, claims it will host the country’s largest combined-cycle gas projects at an AI data center campus in Texas. The company made headlines in October for its $16 billion IPO, in part because it doesn’t yet have any public customers or revenue. But in December, Fermi’s share price plummeted by 34% in a single day after the company said a prospective tenant had terminated a $150 million deal to help fund construction. Toby Neugebauer, Fermi’s billionaire CEO, told Business Insider that Amazon was the tenant, but later walked back the claim.
Flexibility
While major power users have long mulled load flexibility, the concept got a major boost in February, when a group of Duke University researchers released a paper finding that the grid could accommodate more than 100 GW of additional load, provided large loads were willing to flex their energy use.
The paper collided with the speed-to-power rush, contributing to a growing wave of interest in the concept. A new ecosystem of companies and pilot projects has emerged, including a partnership between Nvidia, Emerald AI, EPRI, Digital Realty, and PJM to build the world’s first power-flexible AI campus. The goal, relying on software developed by the startup Emerald AI, is to match certain training jobs to times when the grid has spare capacity. The group’s first facility, a 96-megawatt project in Virginia, is slated to open in the first half of 2026.
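The core idea behind matching training jobs to grid headroom can be illustrated with a toy scheduler. This is only a sketch of the general concept, not Emerald AI’s actual software; the hourly headroom values, job names, and power draws are hypothetical.

```python
def schedule_flexible_jobs(headroom_mw, jobs):
    """Greedily pack deferrable training jobs into hours with grid headroom.

    headroom_mw: dict mapping hour -> spare grid capacity (MW)
    jobs: list of dicts with 'name', 'draw_mw', 'hours_needed'
    Returns a dict mapping hour -> names of jobs running that hour.
    """
    remaining = {j["name"]: j["hours_needed"] for j in jobs}
    plan = {}
    for hour in sorted(headroom_mw):
        spare = headroom_mw[hour]
        running = []
        for j in jobs:
            # Run a job this hour only if it still needs time
            # and fits within the grid's remaining spare capacity.
            if remaining[j["name"]] > 0 and j["draw_mw"] <= spare:
                running.append(j["name"])
                spare -= j["draw_mw"]
                remaining[j["name"]] -= 1
        plan[hour] = running
    return plan

# Hypothetical evening: headroom tightens at 18:00, so both 40 MW
# training jobs pause for that hour and resume afterward.
headroom = {16: 100, 17: 90, 18: 30, 19: 110, 20: 120}
jobs = [
    {"name": "train-a", "draw_mw": 40, "hours_needed": 3},
    {"name": "train-b", "draw_mw": 40, "hours_needed": 2},
]
plan = schedule_flexible_jobs(headroom, jobs)
```

In this toy run, both jobs execute at 16:00 and 17:00, nothing runs during the constrained 18:00 hour, and the unfinished job picks back up at 19:00 — the same curtail-and-resume behavior hyperscalers are pitching to grid operators.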
Hyperscalers hope that by agreeing to curtail their energy use, grid operators might let them skip to the front of the line in interconnection queues. Whether that’s possible practically speaking, however, is up for debate; one particularly skeptical report from PJM’s independent market monitor called data center flexibility a “regulatory fiction.”
“The IMM is rightfully raising that we haven’t solved how to serve data centers in isolation,” Julia Hoos, who covers the eastern U.S. at Aurora Energy Research, told Latitude Media in November. “Nobody thinks about PJM until the lights go out or the costs skyrocket, and that’s what’s been happening this year…We tend to forget that there is no way to supply data centers with power without impacting everyone else that relies on the grid.”