Filings were due last Friday in the advance notice of proposed rulemaking on standardizing large load interconnection. And in the final days before the deadline, hyperscalers, utilities, state regulators, and reliability authorities flooded the Federal Energy Regulatory Commission docket with nearly 200 comments, weighing in on the jurisdictional dispute, cost allocation mechanisms, and reliability standards.
Together, they outline a key tension between the AI industry’s demand for speed to power and grid operators’ mandate for stability. The central divide is whether commenters see AI data centers as national security assets requiring a federal fast lane, or see the grid itself as the asset, put at risk by a rush to accommodate unproven loads.
The comment deadline is the first major milestone in the proceeding, which kicked off last month when Energy Secretary Chris Wright invoked a rarely used legal authority to direct FERC to consider standardizing the load interconnection process nationwide. FERC has historically regulated getting generation connected to the grid, but load interconnection has been left to states. Wright, in his proposal, argues that setup is no longer cutting it, given the unprecedented surge in demand from data centers serving artificial intelligence.
He proposed 14 “principles” to govern FERC’s rulemaking process on the issue, including requiring large loads to pay for the full cost of network upgrades, limiting the study process for such loads, and defining large loads as those above a threshold of 20 megawatts. (Nearly all stakeholder comments agreed that 20 MW was far too low a threshold and risked clogging the queue with non-critical loads. Meta, for example, proposed raising it to 75 MW.)
But with the filings now in hand, it’s clear that the debate before FERC is coalescing around a few key flashpoints: who should get priority access to the grid, and who should pay for it.
As expected, the principles often pit tech companies and grid operators against each other. But there’s also disagreement within the hyperscalers themselves.
The federal jurisdiction question
Filings from OpenAI, Microsoft, Google, the Data Center Coalition, and the Clean Energy Buyers Association all agree that the current interconnection process is hampering AI growth and poses an existential threat to U.S. leadership in the sector.
But they’re divided on the key question of whether FERC should preempt state-regulated interconnection processes.
OpenAI is proposing a “national interest” designation for loads greater than 250 MW, a threshold high enough to limit the designation to only the very largest projects. Such projects should get a federally mandated fast lane, the company argued: “Transmission providers should then be required to expedite the interconnection of these designated projects through a fast track process.”
The other AI giants are more lukewarm on Wright’s proposal, and more wary of heavy-handed federal mandates.
The Data Center Coalition, in its filing, urged FERC to take “a measured approach that recognizes the historic balance of federal and state roles.”
Meta, meanwhile, warns against implementing a “one-size-fits-all regulatory solution” nationwide and instead endorses the development of “guidance, best practices, and, if appropriate, minimum standards” to encourage more uniformity in the processes used by utilities, grid operators, and regulators. Citing the Trump administration’s own “AI Action Plan,” Meta argues that a detailed standard rule “that fails to account for the diversity in the economic landscape” could actually hinder the interconnection process “and undermine the Commission’s goal of bringing more data centers online faster and in a more orderly manner.”
Amazon is less explicit in its filing, but thanks DOE for “initiating discussion” on large load interconnection “in a manner that respects jurisdictional boundaries.”
On this topic, the tech companies are largely aligned with other stakeholders, including the National Association of Regulatory Utility Commissioners, which explicitly opposes FERC asserting jurisdiction over large load interconnections. NARUC’s filing warned that federalizing the queue would interfere with state authority to protect residential ratepayers from the cost of the AI race.
Footing the bill for network upgrades
Another element of Wright’s proposal on which the hyperscalers don’t appear fully aligned is the question of whether large loads should pay for the full cost of network upgrades.
Google and OpenAI appear most aligned with the DOE proposal, endorsing the plan to base transmission service charges on withdrawal rights. Google also offered an olive branch to utilities worried about “phantom” projects in the queue by endorsing a framework that would require large loads to sign long-term contracts and pay minimum demand charges.
For more on Chris Wright’s proposal, listen to former FERC commissioner Allison Clements and Duke researcher Tyler Norris on a recent episode of the Catalyst podcast.
Microsoft, for its part, is willing to front the cash for network upgrades in order to speed things up — but insists that it be reimbursed over time for the benefits that a line provides to the rest of the grid.
Utilities, meanwhile, argued against directly assigning upgrade costs to data centers, preferring instead “rolled-in rates,” where costs are socialized across ratepayers. It’s a logic rooted in the utility business model: if tech companies pay cash for a line, a utility can’t then put the asset in its rate base and earn a return on equity. PJM’s transmission owners, including AES, Duke, and Exelon, among others, drew a clear red line in their filing: “Transmission owners…cannot be forced to construct and operate vast portions of the transmission system without compensation.”
Curtailable loads jumping the line
Yet another point of disagreement is whether large loads that agree to be curtailable during times of grid stress should receive an expedited study process, as Wright proposed.
It’s a topic that’s been much-discussed by stakeholders in the AI-energy nexus in recent weeks, after the PJM Independent Market Monitor released a report attacking the premise of voluntary data center curtailment, labeling it “regulatory fiction.” The IMM also argued that data centers shouldn’t be allowed to skip the line unless they bring their own generation.
In contrast, OpenAI offered strong support for the Wright proposal, arguing that flexible loads “should be given interconnection priority over inflexible loads” because they enhance system efficiency.
Meta, meanwhile, opposed the framing, arguing utilities shouldn’t limit the expedited process based on curtailability. Such a limitation, Meta wrote in an echo of the PJM IMM, “would be incompatible with the operational requirements of many data centers, which often allow for little downtime over the course of the year.” The types of backup power those data centers have for emergencies, the filing added, “can be subject to strict regulatory and operational requirements when used for non-emergency deployment.”
Grid operators including PJM are also skeptical. Creating a separate queue for large loads would be an administrative nightmare and could ultimately cause more delays, PJM wrote in its filing.
Colocation and hybrid facilities
Also playing out in the filings is the technical disagreement over how to model data centers that bring generation with them.
Hyperscalers argue that pairing data centers with new colocated or “electrically proximate” generation reduces the net draw on the grid, and that they should therefore be treated as a single resource whose net impact on the grid is minimal, thereby streamlining interconnection studies and lowering upgrade costs.
It’s an approach that OpenAI explicitly endorses in its filing, as does Google. Google focuses specifically on the point of “electrically proximate” generation — arguing generation doesn’t necessarily need to be on site, or colocated, to count. If a data center and a power plant share a point of interconnection, Google argues, grid operators should consider the “net flow” of the pair as a single, hybrid unit. If a 500 MW data center is sitting next to a 500 MW power plant, the net flow is zero, the hyperscaler argued, and therefore the project should skip the queue (or, as Google proposes, be entitled to an expedited study of 90 days).
Meta, meanwhile, cautioned that such hybrid rules, if not carefully crafted, could ultimately restrict operations. “The Commission should clarify that large load and generation sharing a common point of interconnection do not require additional regulatory hurdles solely due to their co-located configuration,” Meta wrote in its filing.
Utilities fundamentally reject the “net impact” theory in their filings, arguing it ignores the physical reality of how power grids work. The grid must be built to handle the gross load, not the net load, they explained.
Even if a data center is “net zero” most of the time, that large load leans heavily on the grid for other services — like frequency regulation, voltage support, and backup power — that it isn’t paying for. And if the 500 MW power plant trips offline, the data center doesn’t shut down, but instead instantly relies on the grid.
“While it may make sense to coordinate studies, load and generation should be studied separately in the event the system needs to accommodate the full amount of the on-site load, or on-site generation, without the other,” the PJM transmission owners wrote.