The AI bears are having a good week, but they have a long year ahead of them.
The industry spent last week digesting President Donald Trump’s inauguration — and the Stargate announcement that includes a pledge for $500 billion in spending on digital and energy infrastructure.
Then early Monday morning, DeepSeek, the product of Chinese hedge fund High-Flyer Capital Management, released its R1 reasoning model, which matched U.S. rivals like OpenAI on several measures of performance at a fraction of the cost to build and train — and operating with significantly less power demand.
This was terrible news for Nvidia, given its valuation is tied to forecasts of long-term demand for its AI-specific GPUs. Energy companies like Vistra and NextEra took a plunge as well. But it’s potentially welcome news for those concerned about how powering hyperscale data centers could boost emissions.
AI bears, who see all the hallmarks of an infrastructure bubble developing, could reasonably greet the Stargate announcement — which came from Trump, notoriously bubble-adjacent investor Masayoshi Son of SoftBank, and leaders from OpenAI and Oracle — as a clear “top” to the market.
There have not been any strong indications of how that half-trillion will be recovered profitably, or which business models will reward the massive infrastructure investments that big tech companies are making in 2025. Instead, there are plenty of barriers to profitability: the quality of data available for training; the availability of land, water, chips and power; and the need for greater pull from large enterprise users.
And now, the market is facing a possible downswing.
DeepSeek, thanks to export controls, had to innovate around the constraints of lower-end chips. In doing so, the Chinese company showed you didn’t need huge clusters of GPUs. The release of DeepSeek’s R1 model decoupled performance gains from proportional scaling of compute and energy resources. And that success, despite a number of lingering unknowns, was startling enough to call into question many fundamental assumptions about the market.
In a Substack post, Azeem Azhar explained that DeepSeek’s reasoning model “performs about as well as OpenAI’s o1 reasoning model but is about a tenth the cost. DeepSeek’s non-reasoning model, V3, is similarly disruptive. About as good as GPT-4o but one-fifth the price.”
The energy market reaction
After taking a hammer to Nvidia’s stock, investors fled Constellation, Vistra and other power providers. The reasoning is understandable in the near term: if AI performance isn’t tied to the quantity of high-end, power-hungry chips, then forecasts for both chip deliveries and power demand are now in doubt.
Let’s look at the implications for power demand. Nearly every forecast assumed AI data centers would be the dominant driver of U.S. power demand, accounting for upwards of 75% of growth projections, with the remainder tied to electrification and the onshoring of industry.
DeepSeek addressed that need for power in at least two ways: 1) a range of software optimizations that vastly improve how the model operates, and 2) clever engineering built around the constraints of chip availability that led to a more efficient design. The company has said that the model operates effectively at one-tenth the cost and power of current models.
This is a big deal. And it’s made even bigger by the fact that DeepSeek’s model is open source, so developers around the world have access to the code, not just APIs.
If the big tech companies of the U.S. follow suit and can quickly build models that dramatically reduce their reliance on chips and power, it would follow that power demand forecasts need to be updated yet again.
Where we go from here
In a power market already distorted by AI’s demand, what are some of the possible outcomes? A few to consider:
- Data center speculators back away from the market. That could “clean up” some of the interconnection queue, making it easier to gauge what’s real.
- Hyperscaler price insensitivity could rationalize. There have been a number of power contracts in the $100/MWh range or higher in the past year, which prompted excitement from everyone from emerging geothermal developers to shuttered nuclear plant owners. If pricing at this level can’t be sustained, established solar, wind, and natural gas providers will own much of the market.
- Demand for AI will likely increase. Demand for tech is highly elastic, so at a lower price the market will expand. Many enterprises deterred by the price may embrace lower-cost models, which will in turn spur demand. This is at the heart of this week’s discussions of the “Jevons paradox”: the assumption that improved efficiency and lower costs will drive greater adoption of AI. It should be noted that this takes a while to play out, so an AI infrastructure bubble, if there is one, may still burst.
- The AI arms race with China only intensifies, and the Trump administration continues to put its support behind AI providers in the U.S., driving infrastructure investments for the sake of national security and competitiveness.
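The Jevons dynamic above can be sketched with back-of-envelope arithmetic. All of the numbers below — queries, energy per query, prices, elasticities — are illustrative assumptions, not forecasts: the point is only that under constant-elasticity demand, a 10x efficiency gain cuts total energy use when demand is inelastic but can increase it when demand is elastic enough.

```python
# Illustrative Jevons-paradox arithmetic. Every number here is an
# assumption chosen for clarity, not a forecast of actual AI demand.

def total_energy(baseline_queries, energy_per_query_wh,
                 old_cost, new_cost, elasticity):
    """Total energy (Wh) under constant-elasticity demand:
    usage scales with (old_cost / new_cost) ** elasticity."""
    demand_multiplier = (old_cost / new_cost) ** elasticity
    return baseline_queries * demand_multiplier * energy_per_query_wh

# Baseline: 1 billion queries at 3 Wh each, no price change -> 3 GWh.
baseline = total_energy(1e9, 3.0, 0.01, 0.01, 1.0)

# Efficiency gain: energy and cost per query both fall 10x.
# Inelastic demand (elasticity 0.5): usage grows ~3.2x, energy still falls.
inelastic = total_energy(1e9, 0.3, 0.01, 0.001, 0.5)

# Elastic demand (elasticity 1.5): usage grows ~31.6x, energy rises.
elastic = total_energy(1e9, 0.3, 0.01, 0.001, 1.5)

print(f"baseline:        {baseline / 1e9:.2f} GWh")
print(f"elasticity 0.5:  {inelastic / 1e9:.2f} GWh")
print(f"elasticity 1.5:  {elastic / 1e9:.2f} GWh")
```

The crossover sits wherever the demand response outruns the per-unit savings — which is why the same efficiency breakthrough can be read as bearish or bullish for power demand depending on how elastic you believe AI demand really is.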
Policy matters here. The Trump administration has already explicitly thrown its weight behind the natural gas market, lifting the moratoriums on new export permitting for LNG and generally expediting all reviews of LNG projects across the country.
Policy is also at work to support fossil fuels within the AI buildout itself. Trump has elevated AI to a matter of national importance, and through executive order is trying to make it easier to build both data centers and energy supply by easing regulatory and permitting requirements. That energy, according to Trump’s speech at the World Economic Forum, should be natural gas. In a surprise embrace of specificity, Trump suggested these gas plants be built alongside data centers as on-site generation. Supply and demand: they’re ordering both.
The bears may be right that the frenzy around building AI infrastructure in the U.S. just took a much-needed gut punch. But there are no indications that the potential value and utility of AI have been called into question. In fact, the technology has only gotten easier to build — and we still need lots of chips and data centers for inference, not just training models.
What should continue to worry anyone concerned about emissions is Trump’s enthusiasm for gas as the energy resource for any new power demand, and the challenge facing renewable energy developers as they look to compete in this new environment.
A version of this story was published in the AI-Energy Nexus newsletter on January 29. Subscribe to get pieces like this — plus expert analysis, original reporting, and curated resources — in your inbox every Wednesday.