When President Trump was elected last November, there was a sense that his presidency might speed the deployment of infrastructure to power data centers and AI.
While both he and President Biden agreed that the U.S. should lead the world in artificial intelligence, Trump was prepared to lift all safeguards to make it happen. Farewell to NEPA, the Endangered Species Act, and any hope for an AI safety framework, and hello to more gas-fired power plants and unchecked expansion of all AI capabilities. His administration made powering AI a matter of national security.
The results have been mixed. Trump is indeed rolling back environmental regulations that have complicated infrastructure development, as promised. But the pause on deploying IRA and infrastructure law funds has held up projects that could provide much-needed power to data centers.
Meanwhile, DOGE and Elon Musk have spent months fundamentally reshaping the federal workforce, including at the Department of Energy. And Congress is presently wrangling over reconciliation, which threatens to walk back key Inflation Reduction Act provisions that have encouraged the build-out of new renewable capacity.
At the same time, it’s regulatory bodies like FERC and public utility commissions that are on the front lines of the data center boom. They are the ones actually making decisions about whether utilities can raise rates to fund data centers, or questioning utility projections for load growth.
Georgia, for instance, is on the front lines of load growth from both electrification and data centers. The utility Georgia Power is projecting up to 9.4 gigawatts of new load in the next decade as a result. But the state’s Public Service Commission is pushing back against that forecast, citing concerns about underlying assumptions, model specifications, and input data that staff say may lead to overestimations of demand.
How much energy infrastructure Georgia Power — and its many peers grappling with surging demand throughout the U.S. — should plan to build will ultimately depend on the intersection of what the market supports, and what the policy and regulatory landscape encourages.
To take stock, I sat down with Tanya Das, the director of AI and energy technology policy at the Bipartisan Policy Center, and Nic Gladd, a partner in Wilson Sonsini’s energy and climate solutions practice, at Latitude Media’s Transition-AI event earlier this month.
Below is an excerpt of our conversation, edited for brevity and clarity.
Nic, when we spoke last week, you mentioned that you’re hearing a lot about the U.S. moving too slowly to encourage infrastructure development, both for data centers and the power resources that serve them. What do you think the primary problem is with the timeline mismatch between infrastructure development needs and current processes?
Nic Gladd: The speed to power is an imperative. The market needs data center infrastructure now. There’s a supply constraint problem on the generation side of the industry that’s been a decade in the making, with the backlog in interconnection queues. So there’s a fundamental mismatch between supply and demand, which, from FERC’s perspective, means rates will increase dramatically if something doesn’t give.
So most of my clients aren’t looking at the front-of-the-meter options that have eight-plus-year lead times, depending on the utility you’re interconnecting with. They’re looking at more bespoke, creative ways to get there quickly. The problem is, from a regulatory perspective, that’s new and different, and FERC is uncomfortable with that. The proceedings that are most urgent for the commission right now are the ones driven by market participants — like those behind the Talen-AWS deal — who are trying to go to the market with something new and quick that gets around those supply constraints.
Parts of FERC’s regulatory framework are flexible enough to accommodate that — the Federal Power Act is a wonderfully malleable statute — but it takes some political courage. It takes doing something new, and most regulatory bodies are not in that business. So right now, the problem is getting those new ideas to proliferate, and getting the regulators to respond to those transactions that are being negotiated by sophisticated buyers and sellers. And so far that’s not going well.
How are Congress versus the executive branch treating this timeline mismatch differently?
Tanya Das: The federal government, I think, is grappling with the fact that so much of this really needs to be worked out at the state level. There are important things the federal government can and should do to address issues related to data center load, but its role is somewhat limited.
It comes down to things like transmission build-out and permitting, and those are things the federal government is focusing on. The Trump administration is trying to do what it can through executive action. Beyond that, the executive branch is focused on trying to open up federal lands for siting data centers; you see the Department of Energy put out a request for information on that topic. They’re continuing to try to accelerate commercialization of technologies to power data centers, but I think that’s going to be a long-term solution and not a near-term solution.
The Department of Energy, interestingly, is also focused on trying to address some of these regulatory issues. They have started a partnership looking at AI for permitting reform and AI for reforming the interconnection process. So I’d say the executive branch is trying to move pretty quickly.
At the moment, in Congress, all the oxygen in the room is sucked up by the reconciliation debate. Later this year, we might see some action on the funding side — Congress might provide some resources to help with various pieces of this. Hopefully Congress will turn to passing a bipartisan permitting bill; I think that is probably likely to happen after we move past reconciliation. So Congress is moving a bit slower than the executive branch, but I think they’re going to do what they can. Really, though, a lot of this is playing out at the state level.
For companies looking to develop AI infrastructure, what are the key policy and regulatory considerations that you often see overlooked, but that can derail or delay projects?
Nic Gladd: No one overlooks interconnection, but what people overlook are the weeds of the process. They understand interconnection is a challenge; it’s an existential project development risk. But it’s actually a process — a series of stacked risks, both financial and timeline risks, that requires active management. I can’t tell you how frequently I get calls like, ‘We thought we were on top of our interconnection application, thought everything was in good shape, and we missed this really weedsy, hidden provision of a tariff, and now we need you to get us out of jail.’ It happens way more than I would like. And that’s not a good way for civilization to provide central public goods.
Another is curtailment. There’s a lot of debate about “five nines” power reliability. For a lot of the customers looking for that, there’s a blind spot: how do other interested parties view curtailment, and how do they view five nines reliability? When the rubber hits the road, what does that actually mean? At some point, when the system is extremely constrained and stressed and load starts to get curtailed, the load-serving entities and FERC are not going to want to shut off mom and pop; they’re going to want to shut off the big loads to keep the lights on for the voters. So understanding the regulatory requirements around curtailment is, I think, very much under-appreciated, and it has almost derailed multiple deals for me in Texas.
Tanya Das: One piece of this puzzle that I think we need to talk about more is that there are many different types of data centers, and their abilities to serve as flexible loads are all very different. We have colocation data centers — which are different from colocated power — that have multiple customers who rent time and equipment within the facility for their own specific business needs. And we have the hyperscaler data centers that we’re all familiar with, which are focused on training and inference for large AI models.
And the data centers that have multiple customers find it very hard to serve as flexible loads; can you imagine coordinating with 100 different folks who are all renting time on your facility and saying, ‘Hey, can I scale down my operations for you during this time of day?’ It’s too much. And the vast majority of data centers on the grid right now serve that digital-economy function. So when we’re talking about data centers serving as flexible loads, that’s something I’m really skeptical we’re going to see in the near term. But in the long term, I think maybe there’s more opportunity as the proportion of AI data centers begins to grow.
The other thing I’ll flag is on the generation piece. This is sad news, but I hear a lot of excitement around nuclear energy showing up as the solution to powering data centers. Just looking at commercialization timelines for advanced nuclear, I’m very optimistic in the long term, but again, I think that’s something we’ll see in the 2030s. Some of this is related to policy and regulatory timelines and NRC approval processes and all that, but it really is also just a technology development problem, and I don’t think there’s a lot we can do to actually speed that up in the short term.
What do you see as the biggest change in how FERC and state-level commissions are thinking about major energy consumers — especially data centers?
Nic Gladd: The biggest change is at FERC they’re now aware that there’s data center load out there, which is a nice change.
All joking aside, they historically had no idea what was happening on the load side. They would take demand curves, and take load as an input into that. They were focused on where supply and demand meet in their organized markets, and beyond that they weren’t concerned about where the load was coming from. Now data center load is the central policy consideration in pretty much every high profile FERC proceeding.
So there are several dockets open right now at FERC where data center load is the animating force behind some of the biggest priority proceedings of the commission. For example, any proceeding that involves resource adequacy, even in the most tangential way, implicates a policy debate about data center load. It is, I’d say, the biggest show in town at FERC.
At the state utility commission level, they were paying attention historically, but usually just in rate design proceedings. They wanted to know what load was coming in, and what they could do from a rate design perspective to help facilitate it and bring economic development. They weren’t thinking about the broader implications. Now, various states are fully engaged, and some states are still not engaged, but they’re all aware. They’re all monitoring what’s happening, if for no other reason than that what happens at the FERC level will flow down to the customer rates they’ll have to approve — and the political buck will stop with them when the affordability impact becomes too much to swallow.


