The tech industry is pouring $1.3 trillion into data centers globally over the next five years. While efficiency breakthroughs like the launch of DeepSeek’s R1 reasoning model might reduce computing needs, the sheer scale of AI deployment means we’re still facing historic demand growth. Data center electricity consumption doubled during the Biden years, and it’s projected to double again by 2030.
In this episode, we examine how utilities, tech companies, and policymakers are grappling with the wave of data center development. We explore why the “mega-campus” model is giving way to smaller building blocks, how grid constraints are reshaping data center deployment, and why all new generation — whether it’s solar, nuclear, gas, or geothermal — converges at $100 per megawatt-hour.
Then, we sit down with Peter Freed, Meta’s former director of energy strategy, who explains how tech companies evolved from building single data centers to managing massive power portfolios. He shares insights about the critical window between 2027 and 2032, when data center load will hit the grid alongside broader electrification, and why that’s driving new interest in nuclear, geothermal, and grid-enhancing technologies.
Along the way, we tackle some big questions: How are utilities handling the flood of speculative interconnection requests? What does Trump’s $500 billion Stargate project mean for grid infrastructure? And most importantly: who’s going to pay for all of this?
Credits: Co-hosted by Stephen Lacey, Jigar Shah, and Katherine Hamilton. Produced and edited by Stephen Lacey. Original music and engineering by Sean Marquand.
Transcript
Stephen Lacey: So we’re now in this era of the Trump administration where every day brings multiple fresh news cycles. So we’re going to have to timestamp this conversation. Katherine, can you please state the date and time?
Katherine Hamilton: Yes. It’s Wednesday, February 5th at 10:15 AM Eastern Time.
Stephen Lacey: This episode was recorded far ahead of time to accommodate my ill-timed vacation. And no, I am not going to the Riviera on the Gaza strip.
Jigar Shah: Oh my. Too soon. Too soon, Stephen.
Katherine Hamilton: You may not need a passport to do it.
Stephen Lacey: That joke is going to be old by the time this comes out. From Latitude Media, this is Open Circuit. This week: the data center boom gets complicated. Trump’s $500 billion Stargate Project promises to accelerate AI infrastructure just as a breakthrough in China raises questions about how much computing power we actually need. Spoiler: we’re still going to need a lot. Plus, my conversation with Meta’s former energy strategist about how tech companies are reshaping power markets and the small window of opportunity to get it right.
I’m Stephen Lacey, executive editor at Latitude Media. I’m joined by Jigar Shah and Katherine Hamilton. Jigar is a clean energy investor and former director of the DOE’s Loan Programs Office. Hello Jigar.
Katherine Hamilton is the co-founder and chair of 38 North Solutions. Hi Katherine.
Katherine Hamilton: Good morning.
Stephen Lacey: I didn’t see either of you at the front row of Trump’s inauguration.
Jigar Shah: Well, it was the camera angle. You just missed it. I was in the blind spot.
Stephen Lacey: Oh, so you were sitting next to all the tech execs?
Jigar Shah: Well, me and 374 of my closest friends, whoever could fit in the rotunda. Right.
Stephen Lacey: I bring that up because at Trump’s inauguration last month, the tech industry’s most powerful executives filled seats that were typically reserved for foreign dignitaries. And it didn’t take long to see why. Within days Trump was in the Roosevelt room making this announcement.
Donald Trump: These world leading technology giants are announcing the formation of Stargate. So put that name down in your books.
Stephen Lacey: Trump, OpenAI, Oracle, and SoftBank pledged $500 billion to build the world’s largest network of AI computing infrastructure and the energy infrastructure to support it.
Donald Trump: I’m going to help a lot through emergency declarations because we have an emergency. We have to get this stuff built, so they have to produce a lot of electricity and we’ll make it possible for them to get that production done very easily and that will be incredible.
Stephen Lacey: This is part of an unprecedented wave of spending. Tech companies are on track to pour $1.3 trillion into data centers globally over the next five years, and the biggest bottleneck is power. But days later, something big happened. A Chinese company called DeepSeek released an AI model that performs as well as US models at one-tenth the computing power and energy consumption, and tech stocks and utility stocks took a hit as investors questioned whether we need all this infrastructure. Once again, it raised the question: just how much will data centers drive electricity demand? And with the Trump team putting AI dominance at the center of its policy agenda, how will that demand feed the administration’s obsession with building more fossil fuels?
So this is the perfect time to take stock of this issue. And Jigar, your initial take on the DeepSeek news on X, in my view, is the right one. Even if DeepSeek’s claims about its model are true, you still see this enormous wave of demand coming. Walk us through your thinking.
Jigar Shah: Yeah, I think that there’s just so much noise right now, whether it’s around, “Hey, are we going to power this with solar or nuclear or wind or battery storage or geothermal or whatever,” or whether this growth is coming from cloud, which is mostly the growth now, or from AI. We’ve got about 25,000 megawatts of new load that is credibly signing contracts right now for delivery by 2030 or before. And 25,000 megawatts is not a lot per se, but 25,000 megawatts running 24/7 is highly disruptive, particularly if it’s 1,000 megawatts at a time in a very specific place. The grid was not built for that. And so figuring out how we accommodate those kinds of loads continues to be front of mind for me and for many colleagues in the clean energy industry who are trying to figure out how to bring solutions to scale.
Stephen Lacey: And what’s the biggest constraint there? Is it a grid capacity problem or is it an energy availability problem?
Jigar Shah: Well, the grid capacity is part of the problem. As all of our listeners I think know, we have plenty of grid capacity most of the time. It’s just part of the time in the summer when air conditioning loads are high or part of the time in the winter when heating loads are high, that you end up with several hours of constraints. And the question is how do you ride through those constraints? Do you dial back the data center load? Do you put in a whole bunch of batteries? Some of the folks are putting in diesel backup generators. There’s all sorts of solutions.
But the question really becomes: when you add solutions for those constraints, call it a hundred or 200 hours a year, who pays for all of that? Do you give the data center companies the low-low industrial discount rate and then socialize the costs of all the additional infrastructure, which is what we’ve been doing for 10 years? Or do we say, “Wait a second, we shouldn’t be raising rates for poor people just because these data centers are constraining the grid for a hundred hours a year”? So who pays for that? I don’t think the technical solutions are impossible, although there is planning work to be done. But who pays for it is really top of mind right now.
Stephen Lacey: Yeah, absolutely. I want to get to that and then some of the regulatory approaches over how to fund the infrastructure. Katherine, we talked about this not too long ago when I was hosting The Carbon Copy, and a lot has changed since that conversation. I think I sort of framed it around “is this an overhyped or underhyped issue?” What was your answer then and has it changed?
Katherine Hamilton: It’s funny, I went back and listened to it, and I said it was overhyped, because I actually think there are solutions that Jigar alluded to. One thing I’m watching is the planning process and where these facilities are going to go, and that will lead us to the answer of energy or capacity. And part of this is that I live in Virginia. Northern Virginia has the most data centers of anywhere, and they keep building them on farmland and in clusters, much to local folks’ chagrin. That clustering happens because traditional data centers have latency issues; they have to be sited close together. But AI large language models don’t have as much of a latency issue, so rather than building mega-campuses that require a lot of 24/7 resources, it will be much easier to break them into building blocks of 200-megawatt increments. Then there’s less of an issue of them all being together, and you can ask where they can be placed: where the rates are cheaper, and where the resources are.
For example, if you have a lot of wind somewhere, then siting some batteries there might be a much cheaper, easier answer for building a data center. So I think we do have to look at the planning. Is there some headroom in our natural gas infrastructure? What are the local constraints? Because so much of this is going to come down to local pollution, and you can’t just put gas plants everywhere. And gas turbines, by the way, are in short supply and are going to be harder and harder to get. So I think we have to focus on more of a distributed approach, and I mean that in two ways: distributed centers and also distributed assets. What do we have on the distributed side of the grid that we can bring to bear? What can we get out of the grid we already have, every electron we can get? LineVision, for example, did a big report on how grid-enhancing technologies can help squeeze out those extra electrons. And then where can we put these facilities so that customers will be least impacted by rate increases? Rates are going up everywhere.
Stephen Lacey: And I mean I think that 200 megawatt building block emphasis is correct because of capacity problems on the grid. There are not that many places where you can build a one to five gigawatt data center in the US. And we’re hearing some of the leading hyperscalers say that there’s demand for multi-gigawatt facilities, but there are not that many places you can build a facility like that. Jigar, what do you think about the size of these facilities and what will constrain them on the power side?
Jigar Shah: Yeah, I think that it’s important to start with a cost base. One of the big problems we have right now is that, with over 950 new manufacturing facilities already announced and half of them under construction here in the United States, a lot of this one megawatt, three megawatt, five megawatt excess capacity has been used. Someone signed up for it; they’re manufacturing something, they’re building something. So when we talk about 25,000 megawatts, it’s not a lot, but it’s arriving at a time when the grid is already strained, and not just the electricity grid but the natural gas grid. The natural gas grid is at capacity as well around the country.
And so one of the things that I’m finding, and I know many of us know Brian Janous, who at his new company is doing a lot of that work, is that people are trying to find the one to five gigawatt sites where you can place these facilities.
Stephen Lacey: So Brian is the former VP of the energy business at Microsoft.
Jigar Shah: And what he’s saying, and I think what a lot of folks are saying is that everything is a hundred dollars a megawatt hour. And so I think it’s important to recognize that if you build solar plus battery storage with natural gas backup in West Texas, like Intersect announced with Google, or you decide that you’re going to build a new geothermal facility like Google did with Fervo in Nevada, or you decide that ExxonMobil’s excess natural gas production capacity in the Permian is the right place to put a two gigawatt data center with carbon capture, or you decide a new nuclear plant is what you want to build—it’s all a hundred dollars a megawatt hour. And so nothing is cheap. When people are like, “Well, but solar is so cheap,” it is not cheap because by the time you add eight hours of battery storage and then you add natural gas backup and all the things that you have to do, it’s always a hundred dollars a megawatt hour.
So we’re not talking about something where there’s excess capacity and it’s just, “Well, we just have to tweak it over here and put a little LineVision in here and put a little bit of this, a little bit of that.” When you decide to do one to five gigawatts of capacity in one location, it’s highly disruptive and it costs a lot of money to cover that load. So then once you decide that that’s what it costs and you’re not arguing over that anymore, then you’re like, “Well, what does the grid need from a planning perspective?” Because you’re retiring coal, you’re retiring old natural gas plants, you’re retiring other things.
So that’s why there’s this conversation around: if you’re going to bring this much disruption to the grid, well, why not build a new nuclear plant? Then we have clean firm power generation that can replace some of the retiring clean firm power generation. And that’s why, when bids were due on January 17th for $800 million worth of grant funding for the initial steps of new nuclear, it was oversubscribed. A bunch of people bid on it, and all of them were sites that wanted to host one to five gigawatts of new data center load. Those are the only people bidding on that money.
Katherine Hamilton: What you don’t want to do is overbuild like Calpine did in 2017. They just overestimated what they would need. And I think rather, yes, if you’re looking at one to five gigawatts at a time, yes you need a really serious planning process, but if you’re doing these building blocks of 200 megawatts, there are a lot more places you can put those where you have wind that’s already being curtailed, other types of supply that are cheaper that can get you what you need at a certain location. And I think in the end, they’re going to go where it’s cheaper, right?
Jigar Shah: It’s still a hundred dollars a megawatt-hour. Even with curtailed wind in Iowa or Kansas or South Dakota, it’s between eight and 12 hours of battery storage that you have to put there. And if you put in Form Energy batteries, then it’s more money. It’s all expensive. I just think that people have to stop thinking that any of this is cheap. None of it’s cheap. All the cheap spots are gone. People have been mining that stuff since 2021. Those locations are under construction and already online.
Just to give people some context: during the four years that Biden was in office, data center load doubled, from roughly 85 terawatt hours to roughly 170 terawatt hours, and now we’re going to double again to roughly 350 terawatt hours by 2030. And so all the cheap stuff is gone. Now, yes, you can do one megawatt, five megawatts, 10 megawatts here and there with curtailed power, but if you want to do 200-megawatt blocks or 1,000-megawatt data centers, there’s no cheap stuff out there.
Stephen Lacey: Let’s talk about the Trump administration and how it’s thinking about this. Certainly the conversations inside the Biden administration were very different from what is going on inside the Trump administration now. Katherine, when you look at where AI infrastructure sits in the president’s energy dominance agenda, where does all this fit in?
Katherine Hamilton: Yeah, it’s huge. I think it’s the topic of every conversation in Congress and very likely in the White House and throughout the administration. It’s all about China right now, about energy dominance over China. But in the end, one of the things that Jigar keeps saying is that affordability is going to be massive, because this administration also promised to lower the costs of all kinds of goods for customers, and energy is one of those extremely expensive goods. Affordability is going to take a huge hit if we’re not cautious about that.
So I think there’s still a lot of resources out there. I went back to the DOE website, and they have a whole hub on electric demand growth resources with all these papers. Now, if you click on “clean energy resources to meet data center electricity demand,” it takes you immediately to the restoring energy dominance homepage. But there are still a lot of good resources on there, including recommendations from the Secretary of Energy’s advisory board during the Biden administration: look at the power dynamics of these centers and figure out what you need to actually provide them; optimize the grid and contribute to peak load management, for example; and look at what generation, storage, and grid technologies are available. All of the technologies that Jigar and the other programs at DOE have funded over the decades have yielded solutions to this issue, and now we just have to execute on those. And I don’t think those solutions are necessarily going to change, because you still have a very healthy pipeline for grid growth (of course, that needs to happen a lot faster) and for all these clean energy technologies.
Stephen Lacey: I know, Jigar, you want to talk about the affordability piece, but I just wonder if the tech companies are really going to hold the line on clean energy in the Trump era. We’re seeing signs of a shift. Certainly all these companies are still sticking to their zero-carbon commitments, and I think they genuinely do care about procuring as much clean energy as possible. But look at Meta’s massive gas contract in Louisiana. And with the Stargate announcement, Sam Altman has been talking about nuclear and clean energy, yet suddenly the first project under Stargate is being served by a large order of natural gas turbines. I wonder if this creates a license for these tech companies to be more flexible on their generation choices. A lot of these tech companies are suddenly saying, “Well, now we have license to get rid of our DEI programs. Now we have license to get rid of content moderation.” Maybe now it opens up the space for them to deploy a lot more gas without facing the same sort of social consequences. What do you think? Will they hold the line on this? They need everything they can get.
Jigar Shah: Well, I mean it all comes back down to affordability. The natural gas plants that they’re building in Louisiana are the most expensive solution they could have possibly selected. The reason they’re doing it is because they think it’s faster, but it’s $2,000 a kilowatt installed. Remember, natural gas used to be less than half that. And on top of that, all of the natural gas pipelines that serve them are full, so they have to upgrade those pipelines, which can take three to five years. And again, those costs are going to be socialized to the people who are using natural gas on the grid. You don’t pay for the cost of upgrading the natural gas pipeline for your project. You just say, “Well, everybody’s fixed charge on their natural gas bill goes up by $10 a month.” And so everyone is subsidizing Meta in that case.
So, to be crystal clear, it’s a hundred dollars a megawatt hour. The new natural gas is still a hundred dollars a megawatt hour. It’s fine, whatever. It’s just one of the more expensive things they could do, and it’s not actually going to reduce costs over time. Remember, natural gas prices have spiked over the first two weeks of the Trump administration. They’ve gone from around $2.50 per million BTU at Henry Hub up to like $3.50 to $4. We will see what happens to natural gas prices, but a lot of electric utility CEOs got fired in the 2000s because of natural gas price volatility. And so moving from 42% of our grid being natural gas to 62%? That is a recipe for mass firings of electric utility CEOs.
Stephen Lacey: So there’s a lot of activity now on the state level. Regulators are considering how to pay for this infrastructure. Entergy’s agreement with Meta in Louisiana includes minimum payments to cover new generation costs and sharing of current system costs. I believe American Electric Power in Ohio has suggested a rate structure requiring data centers to pay at least 85% of their predicted energy demand monthly, even if they use less. What is some of the activity that you think is worth paying attention to?
Jigar Shah: Well, I think you start by just saying that when you have to upgrade infrastructure, the normal way that that happens is it gets peanut buttered across all of the customers within that territory. So if you have to upgrade infrastructure for one project, everyone pays a little bit extra to pay for that infrastructure, particularly in the natural gas grid, which people are not as familiar with. But the natural gas grid is largely at capacity and a lot of those upgrades are going to increase your monthly bill for your home. And so I just want to make sure that everyone’s crystal clear about that.
Now, what’s happened in Google’s case is they’ve invented this structure called the clean transition tariff, and that’s what they’ve used in Nevada let’s say. And in that case, they can pay for a hundred percent of the cost of Fervo’s geothermal power. So they’re not on this special industrial discount rate, but instead they voluntarily agreed to pay for a higher rate that covers almost all of the incremental new costs that comes from Fervo building a geothermal facility.
But when I’ve talked to utilities and regulators across the country in my old job, a lot of folks were trying to tell me that that was illegal. I mean, I can tell you that there are 20 utilities that I talked to mostly in the Southeast who are like, “Jigar, I think that’s illegal. We’re going to have to offer everybody the same industrial rate.” And I was like, “I’m not telling you you’re going to force them to pay a higher price. I’m saying if they wanted to opt into a higher price, that’s not illegal.” And they’re like, “Well, I guess that’s true if they could opt into a higher price.” But that is the basic level that we’re at right now.
Katherine Hamilton: And I would just say that is patently false because large industrials are constantly having bilateral agreements with their utilities and deciding what on earth they want, what kind of deals they can cut. And of course it’s at the cost of all the other consumers that are just trying to allow their thermostat controls to work efficiently and often don’t even have programs to do that. So these large industrials are able to cut deals, are able to often get different rates. Of course, there’s still some places like in Ohio where the large industrials aren’t even allowed to do demand response, which is insane. And they’re hopping mad at their utility for that. They want to be able to bring whatever resources they have to bear to be able to manage their costs. And I am sure that these data center folks are going to want the same thing.
Stephen Lacey: So there is this base level of 25,000 megawatts of new demand. At the upper end, McKinsey thinks we could see 50 gigawatts of demand, and as you’ll hear in the conversation later, Peter Freed thinks that’s right, although that was recorded before the DeepSeek news. But still, we’re going to see a lot of new demand, because if modeling is cheaper, it’ll expand deployment, there’s going to be a lot of inference, and we’ll need a lot of data centers. Yet nobody knows how these companies are going to make money. So the question is: if we’re overbuilding infrastructure, how big is the bubble? Does DeepSeek challenge any of those assumptions? And if we are in a bubble, and we’re signing all these contracts and then these data centers are underutilized, what happens in energy? Can you parse through that for me?
Jigar Shah: Well, look, I think as Katherine suggested earlier, we have had many periods in our country, particularly in the late nineties, when regulators were conned into believing that we needed to overbuild all of our infrastructure. Customers paid for all of it, and it took a long time to grow into all that infrastructure and amortize it. So I think people are very clear that they don’t want to do that again. AEP is the prime example, and I think a lot of people are copying them and saying: if you want us to do this for you, you will pay regardless of whether you use it or not. And I think every utility should hold the trillion-dollar companies to the same standard. These are companies who can afford to pay it. So if they want something that is purely physics based, and this is not a vibe here, somebody’s got to put some steel in the ground, well then they should pay for that. I think that is totally fine.
I think it’s important, though, to separate this from the 85% of load growth that is happening in ways that are far more manageable, like EV load growth or heat pumps or small manufacturing. A lot of that can be done with distributed capacity procurements, like Xcel Energy Minnesota is doing with Spark Fund, or with virtual power plants, like you’re seeing a lot of folks do. And there are a lot of other solutions, including grid-enhancing technologies and next-generation conductors, technologies that can get more out of the infrastructure we already paid for. I’m just saying that when you decide you’re going to build one to five gigawatts of load in one location, those other solutions are slightly less of a silver bullet. You actually need to build real dedicated infrastructure. And so at the point where you decide that’s something you want to do, and it’s important for us to beat China or whatever, great, just pay for it.
Stephen Lacey: Let’s get your bullish and bearish cases for the grid. Katherine, what’s your bullish case that the data center boom will help the grid?
Katherine Hamilton: Well, I don’t know if it’s going to help the grid, but I think the grid can help it. Just to go back for a second: when I was at Virginia Power, which is now Dominion, Old Town Alexandria was experiencing an incredible business boom. High-rise buildings were going up all over the place, and I think most of them were being served at the time out of a switch and transformer in the back of a Wendy’s restaurant, because there just weren’t enough substations to feed all of these buildings. We had plenty of nuclear capacity, we had a pumped hydro plant, the largest in the world, but we couldn’t get to that last mile.
And what happened was the utilities started getting creative. We had thermal energy rates, we had time-of-use rates. The largest oil company there had an ice storage system to manage all of its heating and cooling. So there were innovative things we could do. Now, over the decades since then, a lot more innovations have emerged outside of the utility, and I think that’s what we’re going to have to rely on. The utilities can’t do all of this; everyone else has to come together.
But I think that in addition to all of these great technologies that Jigar and I have been talking about, there are also ways that data centers can help. For example, there’s a company up in Quebec called QScale that builds data centers using high-density liquid and air cooling systems, heat recovery design, and building design with energy efficiency built in. I think you’re going to start seeing technology innovation on the data center side as well, and that will lower their costs along with costs on the grid side. The issue is going to be planning, and the need for utilities to open up a little bit and allow others to come in and solve problems for them. That’s been the issue all along: utilities have obfuscated and tried to put friction points in the way of some of these new technologies, and I think that’s going to have to change.
Stephen Lacey: Certainly there’s a focus in the industry on demand response for computing. In both conventional data centers and AI data centers, you can take certain kinds of compute loads and shift them to when demand is lower on the grid, or to when there’s more renewable energy supply. Companies like Sidewalk Infrastructure Partners with their Verrus platform, and even Google and I believe Microsoft, are playing with this. So there’s certainly a lot of innovation on the data center operations side. Jigar, bullish case for this?
Jigar Shah: Well, I mean Katherine said it so well, the only thing I’d add is just that what this requires in the bullish case is a clear price signal and you’re starting to see it, but everyone just has to understand that there is no wiggle room. It’s a hundred dollars a megawatt hour. That’s what it all costs. Stop arguing with me, stop saying it’s going to be less. Stop saying things are going to be cheaper. It’s not. It’s a hundred dollars a megawatt hour.
And once you decide that that’s the number, then everything becomes easy because then you’re like, “Oh, I have this fancy new cooling system and it saves this much money if you assume a hundred dollars a megawatt hour. I have this fancy new approach to doing this and it’s cheaper because at a hundred dollars a megawatt hour, it’s this.” One of the things that’s the bane of our existence right now is that ICF continues to use 3 cents a kilowatt hour, $30 a megawatt hour for all of its calculations around energy efficiency and whether things are worth doing and all this other stuff.
But now when you see the contracts that Microsoft and Google are signing with Fervo and some of these other types of companies on demand response, it’s all at a hundred dollars a megawatt hour. Why? Because they realize that that’s their impact on the grid. And so if people can find ways of reducing their impact via these other tools and they can send a price signal to be able to do the work to bring all these nodes online, well then all this innovation that Katherine’s talking about can happen. But if everyone is like, “Well, I think I might be able to get it for free. Well, maybe I won’t have this impact on the grid, maybe I’ll still get the industrial rate, right?” Well then there’s no clear signal in the marketplace and everyone is basically pointing at each other saying, “Why would I bring all these new assets online when I don’t have a clear price signal and I don’t have a clear way of getting paid?”
And so the thing that’s happening in real time in front of our eyes right now is that everyone is saying the quiet part out loud. They’re like, “Don’t say that number.” I’m like “a hundred dollars a megawatt hour.” I’m like Beetlejuice, it’s like “a hundred dollars a megawatt, a hundred dollars a megawatt, a hundred dollars a megawatt hour.” And now everyone is like, “Okay, fine. If that’s the number, then we know what our marching orders are.”
Katherine Hamilton: So, Jigar, this leads to what I think Stephen was going to ask about: the bearish case. Unless we have some regulatory reform, and unless signals are sent for all of these different technologies to participate with parity, we’re going to be in a world of trouble, because you won’t be able to access all of these interesting new technologies. You need to open up the wholesale markets so that resources like thermal batteries are not just treated as load but are actually seen as a resource. You have to bring everybody into the mix and send them the price signal to be compensated for what they’re providing. Otherwise you’re going to have chaos, not a well-planned system. And of course, customers who are already worried about the price of eggs are really going to worry about the price of electricity.
Stephen Lacey: Jigar, what’s your bearish case? If you look across cost to rate payers, lower power quality, capacity constraints, fossil fuel expansion, what feeds into your bearish case?
Jigar Shah: So my bearish case is not so bearish. I think that we are now at a point where new natural gas is about $1,700 to $2,000 a kilowatt installed, which is way more expensive than anyone thought it would be. And what people really want is capacity. NextEra talked about that in their conference call in January. And right now, four hours of battery storage is coming in at $800 a kilowatt, so that’s way cheaper than $1,700 to $2,000 a kilowatt for the same amount of capacity.
So in my bearish case, we’re going to build a hundred gigawatts of new battery storage, four hour battery storage, hopefully eight hour battery storage across the country, and you’re just going to run more coal and more natural gas. So all of those fossil fuel plants that are already operating will just run at higher capacity factors and they’ll just fill up all those batteries. So you’re going to get a hundred gigawatts of new batteries.
I don’t see how anyone’s going to pay the $1,700 to $2,000 per kW for new natural gas. It’s just so expensive. And so the only people who will pay for it are people who can rate base it and they’re like, “Oh, we don’t care.” The IPPs, they’re going to be like, “No way. We’re going to get stuck with $1,700 to $2,000 per kW natural gas plants that five years from now might be put out of business by some other new power generation source.” And so I think the bearish case is that we put a crap load of batteries in—technical term—and then we run a lot of our existing generation more, right? Whether it’s curtailed wind and solar or whether it’s the existing natural gas plants that are running at 30% capacity now, they’ll run at 60% capacity and they’ll just be used to fill up all the batteries.
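The capacity-cost gap Jigar is describing can be sanity-checked with quick arithmetic. This is a rough sketch using only the round numbers from the conversation; the 100-gigawatt fleet size and the per-kilowatt prices are illustrative, not market quotes.

```python
# Back-of-the-envelope capacity-cost comparison, using the round numbers
# cited in the conversation. Illustrative only, not market quotes.

gas_cost_per_kw_low = 1700    # new natural gas, $/kW installed (low end)
gas_cost_per_kw_high = 2000   # new natural gas, $/kW installed (high end)
battery_cost_per_kw = 800     # 4-hour battery storage, $/kW

fleet_kw = 100e6  # 100 GW expressed in kW

battery_fleet_cost = battery_cost_per_kw * fleet_kw      # $80 billion
gas_fleet_cost_low = gas_cost_per_kw_low * fleet_kw      # $170 billion
gas_fleet_cost_high = gas_cost_per_kw_high * fleet_kw    # $200 billion

print(f"100 GW of 4-hour batteries: ${battery_fleet_cost / 1e9:.0f}B")
print(f"100 GW of new gas: ${gas_fleet_cost_low / 1e9:.0f}B "
      f"to ${gas_fleet_cost_high / 1e9:.0f}B")
```

At these prices, a battery buildout of that scale comes in at well under half the capital cost of equivalent new gas capacity, which is the core of the case Jigar is making: batteries plus higher capacity factors on existing plants beat new gas on installed cost.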
Katherine Hamilton: And what you don’t want is a bunch of stranded assets, where you have data centers whose load is shrinking as a result of new technology on their side.
Jigar Shah: Well, that’s why the batteries should just be placed at the data centers and co-located, so that way they pay for it.
Stephen Lacey: I find this topic endlessly fascinating, so I think it will certainly bleed into a lot of our conversations. This is a great time to mention that we’re going to be talking about this in June at a live show at Transition AI in Boston, so stay tuned for more details. In the second half of the show, I’m going to bring in Peter Freed, the former director of energy strategy at Meta, but first Jigar, good to see you. Thanks so much.
Jigar Shah: Man, I am already addicted to seeing you guys every week.
Stephen Lacey: Katherine, thank you.
Katherine Hamilton: Yeah, it’s a good thing. I agree. I’m bullish on this.
[SEGMENT 2]
Stephen Lacey: Let’s get another perspective on this from someone who’s been deep in the trenches on data center development. Peter Freed spent nearly a decade as Meta’s director of energy strategy watching the company evolve from building single data centers to managing multiple massive campuses.
Peter Freed: It is not an exaggeration to say that the industry looks entirely different today than it did 10 years ago.
Stephen Lacey: At our recent Transition AI conference in DC, Peter told me about what he calls the “wall” moment five years ago when Meta faced its biggest ever surge in power demand.
Peter Freed: And internally everyone said, “How are we going to get over the wall? How are we going to get over the wall?” And the team managed to get through it as they always do. The current moment dwarfs the wall. The wall is a tiny little hump that we’re now jumping over.
Stephen Lacey: Peter’s no longer at Meta. He recently left and founded a company called Near Horizon Group, which consults with utilities and tech companies. In our conversation, Peter walked me through some fascinating territory: why he thinks nuclear and geothermal could become critical, how the tech industry’s hunt for power is reshaping electricity markets, whether we’re actually in this AI infrastructure bubble. He also addresses a key question we’ve been discussing: Can the grid keep pace with AI expansion? So here’s my conversation with Peter Freed.
Peter Freed: First of all, I am extremely bullish on data center as a segment separate to the AI conversation. I think that the growth of our digital lives, the footprint of our digital lives with everything that is coming down will continue to grow and we now have this emergent issue of AI and training and inference and all of the different things that are increasing power demand where I think there are still a lot of questions on that in terms of what the trajectory of compute need and the associated load growth is going to be, but it’s still very likely that you put those two things together and we see ourselves on sort of an unprecedented level of demand from the sector.
I buy those McKinsey numbers by the way. I think that team is sharp. I think globally it could be higher than that potentially. We’ll see. And so how we sort of operate in this moment while we are also electrifying everything, as Scott pointed out earlier, we just spent call it two and a half years passing a variety of federal legislation to incentivize electrifying darn near everything we can and it’s working. And so how do we think about this moment in time where we’ve got the immediacy of the data center load growth coupled with a longer trajectory growth in electrification across the economy? Very interesting times.
Stephen Lacey: When we talked back in September, you said to me that you think that the craziest gen AI signals really hadn’t shown up in a lot of the capacity that we’re seeing developed. Now when you look at those McKinsey numbers, 50 gigawatts of demand by the end of the decade, what signals are you seeing now that lead you to believe that that is possible? And the question is, could we actually develop 50 gigawatts of data centers?
Peter Freed: So I think the first piece of it is that when we were talking in September, we were thinking about the things getting announced this year, just looking at the trajectory of announcements for the hyperscalers anyway, because at least some of them talk about what they’re doing. We are seeing more announcements this year than we’ve ever seen in any previous year. That also tends to be true historically, though; they’ve all been ramping up.
What I think is also true is that many of the things getting announced were likely already in planning cycles. The data center planning horizon works like this: every year they’re refreshing whatever their projection is going to be and building to that. I think what is probably happening is that things that would’ve been announced, or broken ground, in 2025 are now being pulled into 2024.
But I also believe that we will find ourselves in a moment, and I think 2025 in particular is going to be a very big year, where many things that got kicked off in earnest in 2024 will begin breaking ground in early to mid ’25, which means those loads begin coming online in any meaningful way, on a three to four year deployment time horizon for a data center, in 2027 and 2028. And so that to me is the beginning of an interesting clock that sort of says, okay, we’re going to see lots of loads showing up, coupled with all that other economy wide electrification that begins in that period. And then we’ve got a window, and maybe the backside of the window is when a bunch of new clean firm, nuclear, advanced energy technology, what have you, shows up, and we’ve got to do some interesting things inside of that window.
Stephen Lacey: I want to talk about how big or small that window is and what could fit through it, but first I want to flesh out the context on data center capacity. There is a lot of question about how big data centers are going to be, and that will have very specific, acute impacts on the grid depending on the size of campuses. Many people think we’re going to see these multi-gigawatt scale campuses. Some think the industry will be limited to the conventional 200 megawatt building blocks of data centers, or we could see even smaller clusters of data centers. What size do you think we’ll see, and what are the impacts on the grid depending on that size?
Peter Freed: Yeah, super question. I am somewhat skeptical that we will see broad deployment of the mega campus design. For what it’s worth, the mega campus is not a new concept. All of the hyperscalers have chased it at some point. We’re talking kind of 800 to several gigawatt campus sizes, and the simple fact is it hasn’t ever taken off previously because that is a very, very hard thing to develop just from the perspective of getting electric utilities, water utilities, entitlements, enough land, labor force, all of the things that you need to do sort of a large scale infrastructure build.
And so the utility, the industry itself has settled into sort of this typical hyperscale data center building block size of about 200 megawatts. So that’s very common. You often see 400, occasionally 600 or a little bit more, but it’s usually in these 200 megawatt building block sizes. There’s a reason for that that’s pushing up against the limit of what is relatively easy to develop from the perspective of all of those things I just mentioned.
Now we enter an era where, from the perspective of generative AI models, people are concerned about training. And there was, especially maybe 12 months ago, this idea that if you just put more GPUs into one place, you would have better training outcomes. But that wasn’t, as far as I know, based in any sort of real computer science. That was just sort of a classical Silicon Valley thing: “Hey, we used 30,000 GPUs to train the last model. What if we got to a million? What would happen?” Really, I think that was the conversation that was happening, and now people are starting to ask questions about whether that’s true. In fact, I think we’re starting to see data that suggests maybe it isn’t true.
So the first thing is do you actually need a million GPUs in one place? And my guess is that we are going to figure out how to crack distributed training. So you put GPUs in different places and you can still run large scale training across that. I’m not a computer scientist, but just chatting with folks that I know who are, I think a lot of people are working on that. So I think as soon as we find a way to computer science our way out of the problem of needing to put a lot of computer intensive resource in a single place, we will probably return to the mean of something like a 200 megawatt building block. It’s much easier to do.
Does that mean we won’t see any of these mega campuses? No, we will definitely see some. In fact, we saw this thing in Louisiana where Entergy and Meta are talking about what appears, from the filing, to be a multi-gigawatt campus. I, by the way, have no independent knowledge of that project; I just, like an energy nerd, dug into the regulatory filing and sort of saw what there was to see. We’ll see a few more of those. It’s not going to be none, but I don’t think it’s going to become the dominant archetype for deployment.
Stephen Lacey: Okay. So you’re working with utilities. What are the most common acute challenges that you’re hearing from utilities that you’re helping them solve?
Peter Freed: I mean, I would love to say that it was cool and innovative technology deployments, but the truth is it’s just figuring out how real this demand signal is. That’s the first thing. How do these companies think? What are they actually doing? And I will say I’ve been in and around the data center business for 15 years now, and I have not seen anything like the level of speculative behavior that we see in the marketplace right now.
The load side interconnection process (and frankly, “process” is probably too strong a word to use here) is not sophisticated, and it hasn’t needed to be for a long time, right? Load has been relatively flat for 20-ish years, and so generally utilities wanted the economic development that came with load growth. They were very happy to have people get into the system. The barriers to entry are low. There really isn’t anything in a lot of cases. The load side interconnection is just a spreadsheet on someone’s computer, and if you were a high economic development priority for the state, it’s pretty easy to get to the top of that spreadsheet.
That doesn’t work in an environment where two people and their dog in a pickup truck are now deciding that they’re going to be data center developers. And so the utilities are just completely inundated with requests, especially in hot markets. And so they’re trying to figure out what to build to. So I think we’re seeing these projections, I can say with a high degree of confidence that those are going to be wrong, both because projections are always wrong, but also because I know for a fact just based on the things that I’ve seen in the marketplace that we are seeing duplicative requests.
And so I think first and foremost, the utility is trying to figure out what the heck is going on and then what to do about it. What do we build to, how do we know what to build? What do we put in our resource planning? How do we address specific constraints on the system and make sure that we are doing what our first and foremost responsibility is providing reliable power to all of our customers?
So that’s the first part of the work. And then secondly, they’re trying to figure out how to take advantage of this opportunity like everybody else, using the resources that they have. We’ve all heard the stories of certain pockets of the grid that are not as interested in seeing more data centers right now because they are pretty inundated, but there are many utilities in the United States that would be delighted to have a data center come. They know that it’s going to be good for the local communities. I think we’re seeing this in Louisiana with this large Meta project that is bringing resources, tax base, and employment into communities that are disadvantaged and otherwise. So I think we’re going to see more deployments in markets that maybe historically haven’t had data centers hunting after them, but that’s certainly going to be something that we see moving forward.
Stephen Lacey: So if we have this window from 2027 to the early 2030s to serve this new demand, what technologies do you think will be able to move through that window? What clean firm power options are you focused on? I know you have a particular focus on nuclear and geothermal.
Peter Freed: Yeah, well, so let’s talk about nuclear for a second. I imagine clean firm in general is sort of on the other side of that window, and I’m doing a lot of work personally to try and accelerate deployment time horizons for nuclear, particularly large scale nuclear actually. I think there’s quite a lot of interest in SMRs, but given the scale of what we’re needing to do, just building a bunch of AP-1000s is not the worst idea. In most scenarios, even if everything works out perfectly, we’re still talking 2032 and on.
And so if you sort of think about this period, 2027 to 2032, we’re going to have to make some interesting decisions. And by the way, we have to be working on nuclear and other clean firm technologies now to make sure that they’re available then. Development time horizons, technology deployments, things like that: they take time.
But also we’ve got this more immediate term need to figure out how to serve this power demand. And so the first thing that I think about is: how do we use the grid that we have better than we currently use it? There’s a suite of technologies, grid enhancing technologies, et cetera, that we’ve known about and that have been hard to deploy from an economic perspective for a long time. These technologies exist, they’ve been around for a while, and yet we don’t see significant adoption. This moment of tremendous load growth, and the economic abundance that comes along with it, does create very real opportunities to unlock the potential of these technologies, if we can draw a clear and meaningful line back to their ability to unlock additional capacity for data centers or whatever else people are building.
And so I think this is an interesting thing. We still have a little bit of a translation problem, so utilities aren’t necessarily thinking about this stuff. The companies that are looking to develop data centers aren’t necessarily thinking about it either. So how do you figure out ways to say, “All right, the deployment of this particular technology in this location would unlock 300 additional megawatts of data center capacity?” That’s kind of where we need to start. And then all of the other considerations come from there.
I’ve been looking at some really interesting analysis from both Rhodium and McKinsey looking at headroom inside of the existing natural gas fleet in the United States. We’ve got somewhere between a hundred gigawatts and 200 gigawatts of available gas energy, not capacity, but energy on the system that we can play with and it’s planned for, right? So this doesn’t mean that you can get away with using it just as you were, but there are a lot of electrons available.
And if we start thinking about the specifics of addressing capacity constraints, either through flexibility in the load or the deployments of batteries or other technologies that are available today, we can unlock some of the available capacity that’s already there to power these facilities much sooner. So that’s the first thing. I think just use the grid you have better than we currently use it. It might get us all the way there, hard to say. And then also we need to make sure that we’re not losing track of what we will need as soon as possible, which is definitively more clean firm as we look not only to think about how to deploy more data center, but how to decarbonize the electricity sector as a whole, which is something that I spent a lot of time thinking about.
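The “use the grid you have” arithmetic Peter gestures at can be sketched roughly as follows. The fleet size and capacity factors below are hypothetical numbers for illustration, not figures from this conversation; the point is only that raising the capacity factor of the existing gas fleet frees up a large amount of annual energy (headroom in energy, as Peter notes, not in nameplate capacity).

```python
# Illustrative headroom estimate for an existing gas fleet.
# All inputs are assumptions for illustration, not transcript figures.
fleet_gw = 500        # assumed installed gas capacity, GW
current_cf = 0.40     # assumed average capacity factor today
achievable_cf = 0.70  # assumed practical ceiling (maintenance, reserves, fuel)

HOURS_PER_YEAR = 8760

# Extra annual energy available if the fleet ran at the higher capacity factor
headroom_twh = fleet_gw * (achievable_cf - current_cf) * HOURS_PER_YEAR / 1000
print(f"Annual energy headroom: ~{headroom_twh:.0f} TWh")
```

With these assumed inputs the headroom is on the order of 1,300 TWh a year, which illustrates why serving new load from the existing fleet, with batteries or load flexibility handling the capacity constraints, can be attractive before any new plant is built.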
Stephen Lacey: How significant is the demand signal from data centers for the nuclear industry? You said you think we should be building a bunch of AP-1000s, but we’re seeing order books for small modular reactors. We’ve got the recommissioning of existing plants. We’re sort of exploring a variety of large scale developments. What models do you think will win out and how significant is that signal from the tech industry and the data center industry for nuclear?
Peter Freed: Yeah, I think the tech industry likes nuclear because you look at the demand profile of a data center—functionally these large block loads—and you look at the generation profile of a nuclear plant, it’s clean, it’s firm, it’s very high uptime. And so I think there’s just a natural fit between those two things, two sides of the same coin.
So I think that’s part of the reason why folks like it. And mind you, these are companies who have deployed tens of gigawatts, over a hundred gigawatts, of variable renewable resources with storage. So they understand that that’s important too. I don’t think anybody is abandoning those targets, but they’re looking at the same modeling that I’m sure many of the people in this room are looking at, which suggests that we can’t get to a fully decarbonized electricity system without clean firm resources. I haven’t seen a credible model in some time that suggests we can.
And so you start thinking about how to do that. So my view on this is that it’s in terms of SMRs versus large scale versus micro, I think it’s all of the above, but because of the consideration that I was talking about with the window, my primary focus has been on speed and scale, which to me suggests deploying the larger scale reactors. We’ve seen the restarts, which are great. We’ve seen obviously the Microsoft Three Mile Island announcement, which I think is fantastic. There’s also a plant that Holtec is restarting—the Palisades nuclear plant. There’s really only one more plant that we can restart, so that’s not going to be a broad opportunity.
We’ve got uprates, so that’s taking existing plants and juicing a little bit more generation out of them through technology improvements. There’s something like eight gigawatts of additional capacity that we could get through uprates. I think that’s a high priority. And then we’ve got the new stuff. And the new stuff, look, a large scale nuclear deployment in the United States is going to be very costly, and it’s going to take demand signals from the marketplace in order to justify that. The largest customers who are likely to be willing to do it are going to be tech companies. And so I think there is a role for them to play there, particularly if they’re thinking about market transformation. So hopefully, in addition to the things that we’ve seen already, we will also see some commitments to large scale nuclear deployment, which I think could be really transformative in terms of moving the grid to a reliable, decarbonized grid of the future.
Stephen Lacey: What’s your outlook on geothermal, and is the path to scaling geothermal for data centers more or less complicated than nuclear? It seems like it would be a lot simpler, but what are the limitations?
Peter Freed: I love geothermal. I think it is sort of the clean firm technology that people talk less about. It’s certainly simpler and probably cheaper than building new nuclear, at least in the beginning. The challenge is just geography. So Fervo is a company which gets a lot of attention for demonstrating that you can generate power with a dry well. So you’re pumping fluid down, you’re generating power in that way. They’re still, in my understanding, mostly limited to sort of existing geothermal basins, Mountain West, et cetera.
There’s a company called Sage Geosystems, which Meta announced a deal with this year, a deal that got started when I was still at the company. I like it because it pushes the geography out. So it’s not the whole country, but you are getting east of the Rockies, which is an important development to the extent that you are looking to deploy into markets where that’s an acceptable technology—like, great. So I think it would be great to see more of that. There is a far future in geothermal that they call superhot rock, which, for lack of technical expertise, basically means drilling super, super, super deep. And you can do that anywhere, but that’s still further out than new nuclear.
Stephen Lacey: Let’s talk about the regulatory piece. There are a lot of states that are grappling with how to pay for network upgrades with an emphasis on getting data centers to pay for all or most of the network upgrades. What do you think the appropriate model is when there are larger system benefits to bringing this clean firm generation online?
Peter Freed: This is a question which is top of mind for customers, regulators, utilities, all of the above, at least in rate regulated jurisdictions, but also elsewhere. The first thing that I would say, because this is what I did at Meta for 10 years and continue to do, is let’s make sure we don’t lose sight of the existing utility and regulatory toolbox. We may be using some of those tools in new and creative ways, but they are there. They’ve worked well for a long time.
And so I’m thinking about very basic regulatory principles of cost causation and cost allocation. How do we think about making sure that we are assigning costs appropriately to end use customers, to the rate base, et cetera? So there are things we can do, and I also think that there are not really data center customers out there today saying, “Hey, I would really like to take advantage of aggressive cross-subsidization in ways that historically we haven’t.”
I think for the most part, people are willing to pay their fair share. I also think one of the things that’s happening though, and I see this showing up all over the energy sector, is people look at tech companies with their large balance sheets and they sort of say, “Hey, those are really large balance sheets. You all should maybe pay for everything.” So maybe not your fair share. Maybe all of the share. To which I would say tech companies didn’t get those balance sheets by being dumb with their money. And so I think we have to find a middle ground here.
The other thing which is also true is that many, if not most utilities in the United States are under very severe upward rate pressure. And so while it is true that historically utilities have desired to put as much as they possibly can into rate base, I actually don’t think that that’s the case in many markets in the United States right now, which to me suggests we may need some new and innovative models, particularly around financing and partnerships that allow us to deploy infrastructure at scale outside of the traditional utility rate-based construct, because I just don’t think rates can bear it.
So one, think about how to allocate cost appropriately. Make sure you are recognizing the ways in which new infrastructure deployments do benefit networks. So this isn’t all going to be direct assigned to customers. There is network benefit associated with this stuff, and you probably do it differently whether you’re thinking about wires or generation or some other things. And two, I don’t think that the existing rate base is going to bear the scale of infrastructure build out that we likely need to see, which suggests to me that we need to look at some new and innovative financing models, likely through partnerships with outside sources of capital.
Stephen Lacey: So this brings us to a bigger policy question, and I have to ask you about the current political environment. At the start, I indicated that we are hearing from people in this space that the Trump administration through its deregulated approach, will provide tailwinds for the space, both for fossil generation, for renewables and clean firm resources and for broader grid expansion. What do you make of the policy agenda of the Trump administration, the people he’s surrounding himself with, the kinds of priorities they’re pushing and what it means for this space in particular?
Peter Freed: This is a small question. Look, I think it’s a little bit too early to say, but I think a couple of things are true. First of all, I wasn’t joking when I said that I think this moment in time, with respect to load growth and all of the things that it can do for our grid, is a serious national security opportunity if we make it such. That will resonate with the current administration, as it should resonate, frankly, with any administration.
But nonetheless, I also think that some of the technologies, particularly on the clean firm side, nuclear, geothermal, et cetera, have felt somewhat less partisan in nature than traditional variable renewables. So we’ll kind of see how that plays out. I’m keeping a close eye on what happens with the IRA. I love some of the work that’s been coming out, particularly from Rhodium showing something like 80% of IRA dollars are flowing into Republican districts across the country.
I think that’s powerful messaging. And then, as you pointed out at the opening, the deregulatory emphasis will be really interesting. There are ways that could be useful and ways that could be less useful. So personally I’m just keeping an eye on where that plays out and how it ends up being meaningful. I am not throwing my hands up in existential dread, though. I think that there will be a variety of ways that the industry will continue to develop.
I think framing AI as a national security imperative, making sure that the US remains dominant in the AI arms race, particularly the race toward AGI, is driving some of this work on the very large campus design, and I think that’s going to continue to be thematic in this administration. And so, at least for data centers, separate from the energy conversation, I think there are strong tailwinds. And then how the energy sector keeps up and continues to play, and how clean that is… maybe some open questions.
Stephen Lacey: Peter Freed, thank you so much. I really enjoyed this.
Peter Freed: Thanks for having me. Great to be here.
Stephen Lacey: Open Circuit is produced by Latitude Media. Jigar Shah and Katherine Hamilton are my co-hosts. You can follow them on LinkedIn or BlueSky. The show is edited by me, and Sean Marquand is our technical director. Anne Bailey is our senior podcast editor. For more in-depth reporting on the topics we cover on this show, sign up for Latitude Media’s Daily, Weekly, or AI Energy Nexus newsletter. Go to latitudemedia.com and hit subscribe. You can find this show anywhere you get your podcasts. You can find transcripts at Latitude Media. And if you want to send us some show ideas, we do really want to hear from you, send a note to editors@latitudemedia.com. We’ll catch you all next week.


