There are many ways to measure an AI company’s prowess, but in 2025, speed to power was the indisputable benchmark. The hyperscalers are still citing capacity as their limiting factor in every quarterly earnings call, and nobody expects that constraint to let up in the near future. The largest tech players are on track to spend over $400 billion collectively in 2025, pouring capital into land acquisition, grid infrastructure upgrades, and bespoke generation assets.
But despite their massive spend, extensive lobbying, and creative purchasing strategies, the hyperscalers weren’t the ones to really set the tone on speed to power in the last 12 months.
The company that is bringing power — and therefore chips — online fastest is xAI, said Michael Thomas, founder and CEO of energy data company Cleanview, which is tracking around 1,000 data centers in the U.S., both planned and operating. Elon Musk’s “Colossus” supercluster in Memphis, Tennessee, has become the AI industry’s benchmark for velocity, albeit at a steep environmental cost.
Colossus began operating in mid-2024, with an 8 megawatt grid connection plus a dozen or so methane-fired turbines onsite to cover the remaining load. In 2025, the company expanded further, installing at least 20 additional turbines to power its 100,000 GPUs.
The strategy got new computing power online in just 122 days; it also sparked immediate backlash over smog and noise pollution. Over the summer, local officials granted xAI a permit to operate a certain number of turbines until 2027 — though litigation over the additional, unpermitted turbines is ongoing.
“This year, it became really clear that there were tradeoffs in terms of speed to power and environmental impact,” Thomas said, adding that the respect Musk still garners from entrepreneurs and tech leaders, even in the wake of his disastrous sabbatical in Washington to slash the federal workforce earlier this year, is key to the impact xAI has had on the speed-to-power conversation. “He sort of moved the Overton window of what’s acceptable.”
The Silicon Valley trope of “move fast and break things” really became a critical theme in the race to power, Thomas said. Colossus embodies that approach more than any other project: getting what was at the time the world’s largest data center up and running on a timeline previously unheard of in the industry.
“That set the tone in a lot of ways, and now everyone else is playing catch up in terms of speed to power,” Thomas added. “It shocked everybody and made them realize they’ve got to do something different.”
The reverberations of the Colossus shock, he said, resulted in every data center company calling up their existing contractors and asking how they could move faster — a lot faster. “Elon Musk’s approach to business is to really push on everything and test the assumptions,” Thomas said. Now, data center developers are taking the same approach, pushing on every vendor and contractor and partner to see who can move more quickly.
A procurement spectrum
Despite xAI’s impact, power purchase agreements for wind and solar remain the most common power strategy in the AI race, Thomas explained. PPAs have been a key part of Big Tech’s power playbook for a decade, and they still make up the bulk of procurement by volume.
But there are a handful of other strategies emerging. A second approach involves the longer-term projects slated to come online in the next few years, such as Microsoft’s deal to restart Three Mile Island and Google’s hydroelectric licensing deals. Those projects aren’t speculative or risky, Thomas explained. They’ll take a few years to finalize, but they’re almost guaranteed to come online. Many fossil gas projects fit into this category, he added: tried-and-true tech that takes a while to build given procurement slowdowns and permitting challenges, but offers guaranteed firm power.
Then there’s a “far out” strategy of headline-generating agreements and investments in uncommercialized power sources that may not materialize on a timeline relevant to the AI race. In this category Thomas points to Amazon’s bets on nuclear fusion and SMRs, and Microsoft’s deal with Sam Altman-backed Helion Energy.
That said, the data center buildout is on a spectrum. xAI sits at one end, as the “fastest and worst for the environment,” Thomas said. Microsoft and Google sit toward the other end in terms of environmental attributes, focusing on complex carbon-matching and flexibility efforts. Amazon and Meta fall somewhere in the middle, developing large gas plants next to their mega data centers to ensure reliability while still buying renewables.
On other metrics, xAI doesn’t stand out from the crowd. Google, for example, has set the tone on flexibility, pioneering the “clean transition tariff” in partnership with NV Energy, deploying battery storage to mitigate load spikes, and hiring Duke University researcher Tyler Norris, whose paper on potential headroom on the grid set off the flexibility conversation earlier this year.
Amazon, meanwhile, remains the heavyweight champion for sheer scale. In the last 12 months, AWS added 3.8 gigawatts of data center capacity, significantly more than the 2 GW of its closest follower, Microsoft. xAI, for its part, has added only around 1 GW, Thomas said — though, of course, it did so incredibly quickly.


