Peter Freed, the former director of energy strategy at Meta, on how he views the decarbonization challenge for data centers in the AI era.
Two years after the start of the AI arms race, data centers are suddenly under intense scrutiny as power demand from warehouse-scale computing outpaces new renewable energy supply, creates grid connection backlogs, and takes utilities by surprise.
But we aren’t really feeling the full energy impact of the AI era yet.
“Most of the data center announcements that have been made this year have probably been in development for 12 to 18 months, which means that most of them predate the craziest gen-AI signal inside companies,” Peter Freed, Meta’s former director of energy strategy, told Latitude Media.
Data center capacity was already growing quickly before the AI arms race kicked off two years ago. Much of the newly planned capacity to serve AI workloads – large GPU clusters that require far more energy to operate – will likely start coming online in 2027 and 2028.
“That's when there's going to be a big bump in data center related load. And there is not sufficient generation to meet all of that,” said Freed.
Freed, who left Meta in April of this year after a decade at the company, now consults with a range of tech and energy companies that are grappling with how to decarbonize the rapid expansion of new data centers. (Freed will also be a keynote speaker at the upcoming Transition-AI conference.)
He believes there’s a very small window of opportunity to get it right.
“Under the rosiest scenarios for advanced clean, firm technology – mostly nuclear but also geothermal and long-duration storage – I think at scale we're probably looking at the best case in the early 2030s,” said Freed.
There are 22 gigawatts of planned data centers in the US, according to JLL. That means the period between 2027 and 2030 is when a lot of new capacity will get built – and “a lot of hard tradeoffs are going to be made.”
The tradeoffs are already happening. Even as large tech companies dominate wind and solar power procurement, long interconnection queues, local resistance to projects, and equipment supply constraints are capping development.
Meanwhile, utilities are proposing many gigawatts of new fossil gas plants to meet data center demand. And old coal plants are staying open longer to serve new demand.
Freed worries deeply about grid constraints and building more fossil fuels – “I am concerned because I've been a climate person for my entire career” – but he thinks the data center problem is sometimes framed in unhelpful ways.
“We're talking about load growth and the constraints that it creates in a very general way,” he said.
Data centers accounted for 4 percent of US electricity use in 2023. Even as capacity additions doubled over the last decade, the sector’s electricity consumption grew only incrementally. But Morgan Stanley expects data center power demand to double by 2030 thanks to AI workloads, and the Electric Power Research Institute has come to a similar conclusion.
This growth is large and introduces new complexities. But we shouldn’t view it as an existential threat to the power system, says Freed.
“Some people are like, ‘oh, we're running out of power. What are we going to do?’ And the power system is very complicated and I think those constraints are much more local and specific,” said Freed.
While a growing number of gigawatt-scale facilities are being planned, new hyperscale data centers typically fall in the 100 to 250 megawatt range. Integrating those facilities onto the grid rarely presents an energy constraint – it’s more often a capacity constraint.
Those capacity constraints often create challenges on summer afternoons or cold winter mornings a handful of times per year, not all year round.
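To make that distinction concrete, here is a rough back-of-the-envelope sketch in Python. The numbers are assumptions for illustration – a hypothetical 200-megawatt facility and hypothetical local peak headroom, not figures from the article or any specific project – but they show why annual energy is rarely the binding issue while peak-hour headroom often is.

```python
# Back-of-the-envelope illustration (assumed numbers, not from the article):
# a data center's annual energy use vs. the peak-capacity question that
# actually constrains interconnection on a handful of stressed hours.

FACILITY_MW = 200            # hypothetical hyperscale campus in the 100-250 MW range
UTILIZATION = 0.8            # assumed average load factor
HOURS_PER_YEAR = 8760

annual_energy_gwh = FACILITY_MW * UTILIZATION * HOURS_PER_YEAR / 1000
print(f"Annual energy: ~{annual_energy_gwh:,.0f} GWh")   # ≈ 1,402 GWh

# The binding question is usually local headroom during peak hours,
# not annual energy. With assumed headroom of 150 MW, the facility's
# full 200 MW draw only exceeds it on a few stressed afternoons or mornings.
LOCAL_PEAK_HEADROOM_MW = 150   # hypothetical transmission/substation headroom
shortfall_mw = max(0, FACILITY_MW - LOCAL_PEAK_HEADROOM_MW)
print(f"Peak-hour shortfall: {shortfall_mw} MW, only during constrained hours")
```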
“The solution is not necessarily to go build more generation. Sometimes it is, but a lot of times it’s reconductoring a line with a higher capacity wire, or putting a battery in. This is really where grid-enhancing technology shines a lot.”
This is why the Department of Energy has been pushing virtual power plants and grid-enhancing technologies as critical to meeting the demands of a “peakier” grid system. And it’s also why companies like Google have been investing in demand response capabilities for data centers, in order to shift computing workloads away from critical grid periods.
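As a minimal illustration of the workload-shifting idea – a hypothetical scheduler sketched for this article, not a description of Google’s actual system – flexible batch jobs such as model training can be deferred out of an assumed evening peak window while latency-sensitive serving keeps running:

```python
# Hypothetical sketch of data center demand response: defer flexible batch
# jobs out of a grid peak window, keep latency-sensitive work running.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deferrable: bool   # batch training/analytics can wait; serving traffic cannot

PEAK_HOURS = set(range(16, 21))  # assumed 4-9 pm local grid peak window

def schedule(jobs: list[Job], hour: int) -> list[Job]:
    """During the assumed peak window, run only non-deferrable jobs; otherwise run everything."""
    if hour in PEAK_HOURS:
        return [j for j in jobs if not j.deferrable]
    return jobs

jobs = [Job("inference-serving", deferrable=False), Job("model-training", deferrable=True)]
print([j.name for j in schedule(jobs, hour=18)])  # peak hour: ['inference-serving']
print([j.name for j in schedule(jobs, hour=2)])   # off-peak: both jobs run
```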
Of course, we still need a lot more clean, firm generation. And this is where Freed believes the conversation is also misguided.
There is a growing dispute in Ohio over who should pay for the grid upgrades to support new generation for data centers – a battle that could spill into other states.
“People are saying, ‘oh, data centers should just pay for any new generation that needs to be built to serve them.’ And I don’t necessarily think those conversations actually help. We need innovative and creative thinking,” said Freed.
Unless a facility is islanded, it’s rare that new generation only benefits a single load. Freed believes we need to consider the larger grid benefits of new, clean power plants – and the upgrades to handle them. “Some of the conversations seem to miss that.”
Data centers are part of a larger load growth picture, driven by electrification and manufacturing onshoring. They present an immediate challenge that exacerbates constraints on the system, but “even under the most aggressive scenarios, data centers are a low double digit percentage of the whole,” said Freed.
He believes that data centers can help prepare the U.S. power sector to handle higher overall load growth, as argued in an RMI analysis in May.
“We need to be taking a long-term perspective, such that solutions that might work for adding data center load in the near term open a window to making sure that we are ready for all of the things that we are electrifying – in part because of the many federal policies that incentivize electrification across the economy.”
The industry is getting creative as it scrambles for power.
Google and Meta are investing in advanced geothermal projects. Google has developed a new tariff structure with NV Energy to support technologies like geothermal – without passing the premium on to ratepayers.
Big tech companies are also sending a strong demand signal to the nuclear industry.
Last week, Constellation Energy announced a 20-year power purchase agreement with Microsoft as it seeks to re-open the Three Mile Island nuclear plant in Pennsylvania.
“Those are near-term opportunities, but they’re not to the scale that we’re going to need,” said Freed.
This is where club buying can play a role – similar to the advance market commitments made by Frontier Climate for carbon removal. A recent partnership between Google, Microsoft, and Nucor to stimulate demand for advanced nuclear has “some flavors of that,” explained Freed.
This kind of advance commitment is particularly important for nuclear, where a strong market signal is needed to kickstart the supply chain. “I think more and more people are circulating this idea. How do you demonstrate there’s enough of a market that we actually get the ecosystem in place?”
These investments – assuming they don’t just support controversial behind-the-meter deals – could bring benefits to the wider grid.
Meanwhile, there’s still a question about what the demand curve of data centers will actually look like.
“Generative AI was a science fair project to many companies until 14 to 16 months ago. Now you have the full weight of the best engineering teams on the planet working on these algorithms, and I think we will blow algorithmic efficiency through the roof. So to me, I'm not sure I see this going up asymptotically to the right forever,” said Freed.
Peter Freed will join us at Transition-AI: Washington, DC on December 3rd. Other speakers include Page Crahan, general manager of Tapestry, Google X’s moonshot for the grid; Helena Fu, DOE’s chief innovation officer; and Chris Shelton, chief product officer at AES. See the agenda here.