There’s a lot to worry about when a data center developer comes to town. What does it mean for local land and water use? Will this development really create lasting jobs? Will it drive up emissions if renewables aren’t available? And what about the noise?
And one of the biggest and most critical questions for utilities: will data center demand rooted in the artificial intelligence boom drive up electricity rates? In a number of recent earnings calls, including at DTE and AEP, that question was largely answered with a “no” — or, even better, “it will drive them down.” This feels a bit too good to be true, so it’s worth taking a hard look at the issue and how it’s being addressed across the country by regulators, legislatures, and utilities.
The dynamic at work is fairly straightforward, but like pretty much everything in power markets it’s anything but simple in reality. It boils down to how the costs of power supply, energy infrastructure, and network infrastructure are allocated. Utilities argue that large data center customers help spread fixed system costs across a larger demand base, potentially reducing per-unit costs for all customers. However, if a data center needs a new substation and transmission line, the question becomes whether other customers should bear any of those costs.
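The cost-spreading argument is ultimately arithmetic. Here is a minimal sketch using entirely invented numbers (a $1B/yr fixed-cost base, 10 TWh/yr of existing sales, and a hypothetical 100 MW data center running at a 90% load factor) that shows both sides: the new load dilutes fixed costs, but socializing the cost of dedicated upgrades can wipe out the benefit.

```python
# Hypothetical illustration of fixed-cost allocation. Every number here
# is invented; this shows the arithmetic, not any actual utility's rates.

def fixed_cost_rate(fixed_costs_usd, total_kwh):
    """Per-kWh share of fixed system costs, spread over all energy sold."""
    return fixed_costs_usd / total_kwh

existing_fixed = 1_000_000_000   # $1B/yr of existing fixed grid costs
existing_sales = 10_000_000_000  # 10 TWh/yr of existing retail sales

# A 100 MW data center at a 90% load factor adds roughly 788 GWh/yr.
dc_sales = 100_000 * 8760 * 0.9  # kW * hours/yr * load factor = kWh/yr

before = fixed_cost_rate(existing_fixed, existing_sales)
after = fixed_cost_rate(existing_fixed, existing_sales + dc_sales)
print(f"fixed-cost share: {before:.4f} -> {after:.4f} $/kWh")

# But if serving the data center requires, say, $200M/yr of new network
# costs that are socialized rather than assigned to the data center:
after_upgrades = fixed_cost_rate(existing_fixed + 200_000_000,
                                 existing_sales + dc_sales)
print(f"with socialized upgrades: {after_upgrades:.4f} $/kWh")
```

With these invented inputs the fixed-cost share falls about 7% when the load is added, then rises above the starting point once the socialized upgrade costs come in, which is exactly the allocation fight playing out in the proceedings below.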
How to allocate costs is a matter of ongoing debate. From a utility operations perspective, the predictable baseload of a large data center can be more efficient to serve than variable residential demand. However, the sheer scale of the power requirements can strain existing infrastructure and require significant transmission upgrades. And utilities may be right to worry that AI customers won’t be around in a few years, leaving the assets stranded.
Also, if a hyperscaler is looking to source clean energy for its new data center, that can drive additional transmission costs, since renewables are often difficult to site near data centers.
Utility and energy company approaches to using data centers to keep rates down vary widely — and it’s becoming clear that regulators are worried that more may be needed to ensure costs are allocated properly.
State-level action
At the state level there is activity both in legislatures and utility commissions to either set new rules for large data center cost allocation, or else study impacts of these data centers on the grid. Here are a few to watch:
Texas
Texas, naturally, is taking a big swing with a new bill called SB6 that addresses challenges posed by large loads, such as data centers and crypto mining facilities, in ERCOT.
Its key provisions include requiring large energy users interconnected after 2025 to pay retail transmission charges based on their peak demand, regardless of co-location arrangements; directing the Texas PUC to establish interconnection standards for facilities with a demand of 75 MW or more; and compelling large load customers to provide proof of financial commitment. Other elements of the bill protect against duplicative requests in the interconnection queue, and give ERCOT some control over the generator in the case of certain events, like load shedding.
As someone who used to work for the Texas legislature, I can tell you that when a bill has this low an identifying number it means it is at the front of the queue and will get priority treatment.
Virginia
Worrying that demand will far outstrip supply, Virginia lawmakers put forward two bills to address data center concerns: HB2101 and SB960. However, in what is perhaps evidence of how tricky and intractable this issue is, both have already failed to pass.
HB 2101 would have directed the State Corporation Commission to investigate whether non-data center customers are subsidizing data center costs. If subsidies were found, the bill would have required new rules by January 1, 2026, to mitigate them, and would have temporarily halted new projects benefiting data centers until the investigation concluded. The bill was killed in subcommittee on a 5-0 vote.
The Senate counterpart to HB 2101, SB 960, would have directed the SCC to ensure fair energy cost distribution and protect non-data center customers from bill increases tied to infrastructure investments for data centers. The bill initially passed the state Senate 26-23-1, but it ultimately failed in conference committee with the House.
Meanwhile, two additional bills — HB 1601 and HB 2035 — would require local assessments of environmental and community impacts before new data centers are permitted, and would mandate public reporting of energy and water usage. HB 1601 has cleared the House, but HB 2035 failed to make it out of committee.
California
In California, SB 57 proposes a special rate structure for data centers to prepay expected energy consumption, ensuring grid investments are fully recovered while protecting residential ratepayers from increased costs. It also focuses on aligning data center operations with renewable energy goals.
Meanwhile, SB 58 would provide a tax credit to data centers that use at least 70% carbon-free energy, including at least 50% from behind-the-meter sources. (This comes with the caveat that a qualifying data center couldn’t use any diesel fuel or recycled water cooling within five years of certification date.)
These bills are being considered against a backdrop of utilities speaking plainly about the shift underway.
Just last week, PG&E announced in its quarterly earnings call that, thanks to new load from data center operators, it estimates rates could fall by 1% to 2% for every new gigawatt added. The utility is making the case for increased utilization here — and it has paired that with a request for a new tariff, called Rule 30.
The tariff, which is currently under consideration by the California Public Utilities Commission, is designed to streamline transmission-level large-load connections and allow upfront customer funding for infrastructure other than transmission network upgrades. The premise is that hyperscalers could be reimbursed once the load becomes operational, so existing customers don’t bear any costs if the projects fail to materialize.
Georgia
Georgia’s Public Service Commission is weighing a new rule that would require large-load customers, including data centers using over 100 MW, to pay transmission and distribution costs associated with their projects. The change would mean that contracts with Georgia Power for such customers must be reviewed by the PSC, and ideally would prevent existing customers from bearing additional costs due to large energy demands from new facilities.
A bill in the Senate, SB34, would add further protections for ratepayers, but it’s uncertain whether it will pass.
Ohio
In 2024, AEP Ohio proposed new rates requiring data centers and crypto mining firms with loads over 25 MW to pay demand minimum charges under a 10-year commitment, based on usage forecasts and signed before AEP builds any infrastructure to serve them.
The proposal aims to ensure that residential and lower-income customers are not burdened with stranded costs. It would raise the minimum charge to cover 90% of a customer’s forecasted demand, even if actual usage is lower, up from 60% under the current tariff.
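To make the mechanics concrete, here is a minimal sketch of how a demand-minimum floor works. The 60% and 90% floors come from the proposal; the forecast and usage figures are invented for illustration.

```python
# Sketch of a demand-minimum charge: the customer is billed for at least
# a floor percentage of its forecasted demand, whatever it actually uses.
# The 60%/90% floors are from AEP Ohio's proposal; all other numbers are
# invented.

def billed_demand_mw(forecast_mw, actual_mw, floor):
    """Bill the greater of actual demand and the contractual floor."""
    return max(actual_mw, forecast_mw * floor)

forecast = 100.0  # MW the data center committed to in its forecast
actual = 50.0     # MW it actually draws

old = billed_demand_mw(forecast, actual, 0.60)  # current tariff
new = billed_demand_mw(forecast, actual, 0.90)  # proposed tariff
print(f"billed demand: {old:.0f} MW under 60% floor, "
      f"{new:.0f} MW under 90% floor")
```

In this invented case, a customer drawing half its forecast would be billed for 60 MW today but 90 MW under the proposal, which is how the higher floor shifts forecast risk from ratepayers onto the data center.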
Indiana
The Indiana Utility Regulatory Commission recently approved modifications to an industrial tariff for Indiana Michigan Power so that customers with power requirements of 70 MW at a single site or 150 MW across their company face strict rules around contracting and financing their electric service requests. It’s an update to an existing industrial tariff and has data center operators in mind. Like other tariffs for large loads, this update requires upfront financial commitments and is designed to protect ratepayers from paying for stranded assets.
The decision — driven by a settlement agreement involving Indiana Michigan Power, Amazon, Google, and Microsoft, the Data Center Coalition, and consumer advocacy groups — is only a first step and will be followed by development of cost allocation methodologies.
FERC and the colocation option
Some hyperscalers are looking to make an end run around the whole process by colocating with generation, often with behind-the-meter arrangements for sourcing that power.
Placing generation, including renewables and storage, at the same site as a data center can speed development while also keeping the hyperscalers from paying for additional transmission service. So far, however, colocation is a fairly niche practice — and both FERC and many utilities aren’t quite convinced it is the right way forward for the whole industry. Work is underway to create a clearer set of rules for data center operators.
FERC launched its colocation review on February 20, with a focus on PJM. Responding both to a complaint from Constellation Energy and to input from last year’s technical conference, FERC asked PJM and its transmission owners to respond within 30 days to several dozen specific questions on colocated loads and their impact on the transmission system.
This is an ongoing proceeding, but early indications suggest FERC doesn’t agree that data center operators should be exempt from paying for the benefits of being connected to the whole power grid. As a result, IPPs seem worried that this uncertainty will push data centers away from power markets like PJM — and toward regulated utilities, where the rules can be much clearer.


