In the spring of 2024, the Department of Energy published its tenth “liftoff” report, outlining a pathway to scale for grid solutions including advanced transmission and grid enhancing technologies like advanced conductors, dynamic line rating, advanced power flow control, and topology optimization.
These “innovative grid deployments,” the report found, have the potential to increase transmission and distribution capacity in the U.S. to support up to 100 gigawatts of incremental peak demand. DOE laid out four pillars for achieving commercial liftoff: building a bank of evidence, developing operator know-how, refining the cost-benefit pitch, and aligning economic models and incentives.
Today, thanks to massive load growth from data centers, there’s been an uptick in interest in these technologies: everyone from the White House to state legislators is considering their potential to squeeze more out of existing wires in the name of speed to power. By some bullish estimates, existing grid tech can meet as much as 95% of expected load growth.
There’s been a lot of progress, and a lot of projects, but widespread adoption isn’t happening as quickly as advocates would like, and it’s not evenly distributed across technologies.
Pillar 1: Building a bank of evidence
Many grid-enhancing technologies, a category that includes dynamic line rating, advanced power flow control, and topology optimization, have already been tested in the field, especially on transmission lines. But according to Julia Selker, executive director of the WATT Coalition and director of policy and strategy at Grid Strategies, many of those case studies are happening in other countries that are moving faster than the U.S. Domestic examples do exist, though: during a panel at San Francisco Climate Week, Selker highlighted that Northern California’s PG&E is deploying advanced power flow control technology in Silicon Valley in a bid to boost capacity for new data center development.
And while some DOE funding via the Grid Resilience and Innovation Partnerships (GRIP) program has been pared back, there are still key projects that have retained funding. Selker pointed to Georgia Power’s first large-scale deployment of advanced power flow control technology, an effort funded in part by a $160 million DOE grant. The federal government also appears ready to deploy funding for advanced transmission projects under the rebranded SPARK program, which will leverage $1.9 billion in remaining unobligated GRIP funding.
Building up that bank of evidence has proved a little trickier on the distribution side, said Morgan Campbell, co-founder and COO of GridIQ, an early stage startup specializing in distribution grid sensor technology and machine learning-based fault detection.
“We need access to a medium voltage line, like 12,000 volts…and so it’s difficult to just find a testing facility and be able to start gathering data,” Campbell explained on the same panel with Selker.
GridIQ does have a handful of pilots, including one in Humboldt County, where the company’s real-time data will help a local microgrid re-energize faster after a public safety shutoff. But a partnership with Idaho National Laboratory, which will see the company fold its sensors into INL’s federally funded wildfire utility research project to gather high-voltage performance data, is also helping it grow its evidence base.
Pillar 2: Developing operator know-how
As DOE explained in the liftoff report, scaling advanced grid technologies will depend on grid operators knowing how to procure, install, operate, and maintain them. Utility peer-sharing, the report added, “reduces information asymmetry…and accelerates overall uptake by reducing barriers to adoption.”
The spread of operational experience today is relatively solid but still uneven, and pilots aren’t consistently translating into scalable deployments, Selker added: “The thing grid-enhancing technology companies go crazy over is ‘we’ve done 10 pilots…do I have to pilot with all 3,000 utilities before they agree to use my technology?’”
The fragmented regulatory landscape, where FERC, NERC, and state authorities each control different pieces of the puzzle, is a key barrier, and “ends up being a little bit of a game of ping-pong,” she said.
Inside utilities, the operational staff who understand the technologies aren’t always able to influence planners and capital allocators, said Brian Turner, a senior director at Advanced Energy United.
“Operations people, increasingly, thanks to the pilots… know about these technologies and what they can do for the system,” he said. “Do they get them to the people that are arguing over the capital allocation budget? Do they get them to the planners who can use them in the planning timeframe?”
That’s a problem policy can help with, however. The U.K., for example, uses a model in which, if a utility can solve a constraint more cheaply than by building new infrastructure, it gets to keep half of the savings.
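The incentive math behind that shared-savings model is simple to sketch. The function and dollar figures below are hypothetical illustrations, assuming the 50/50 split described above:

```python
# Hypothetical sketch of a UK-style shared-savings incentive: if a
# grid-tech fix is cheaper than the traditional build, the utility
# keeps half the difference. All figures here are invented.

def shared_savings(traditional_cost: float, alternative_cost: float,
                   utility_share: float = 0.5) -> float:
    """Return the utility's reward for choosing the cheaper alternative."""
    savings = max(traditional_cost - alternative_cost, 0.0)
    return savings * utility_share

# e.g. a $100M rebuild avoided by a $20M grid-tech deployment:
# the utility keeps half of the $80M saved.
print(shared_savings(100_000_000, 20_000_000))  # 40000000.0
```

Under this structure the utility profits from deferring capital spending rather than only from making it, which is the inversion of incentives the liftoff report is after.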
Pillar 3: Refine the pitch on benefits and costs
Compared with the first two pillars proposed by DOE, Turner said, the latter two have further to go.
Part of the problem is that most utility projects evaluate technologies individually, via one-off case studies. Rather than bespoke integrations, the industry needs a consistent framework for technologies that interact with each other and can be applied across utilities, Turner said. And that framework should include behind-the-meter resources, which the DOE report excluded, he added.
There’s also a mismatch between how new technologies create value and how utilities are allowed to pay for them, Campbell added. GridIQ, for example, would ideally charge a monthly subscription for its operations: the company’s hardware is cheap, but the machine learning models it runs to monitor lines in real time are what eat up costs. Paying by the month just won’t work for most utilities, though, so GridIQ is considering “how we’re going to do our go-to-market pricing so that we fit better into the capex model.”
Pillar 4: Aligning economic models and incentives
The hardest part of making DOE’s liftoff proposal a reality, said Selker, is aligning utility compensation models with the value generated from advanced grid solutions, as well as the ratepayer benefits.
“I’ll say the closest we’ve gotten to that is the AI boom,” Selker said. “Right now, there’s a real reason for utilities to move as fast as possible to get these new loads plugged in so that they can sell them electricity.”
Some first movers are taking more creative approaches. National Grid, for example, deployed dynamic line rating on its grid in New York state through its venture capital and innovation arm, National Grid Partners, which allowed the utility to quickly fund and deploy the technology without first seeking regulatory rate approvals. AES, meanwhile, implemented its own DLR deployment in Ohio and Indiana via a direct commercial partnership with LineVision, leveraging the utility’s innovation budget to bypass lengthy rate cases and quickly validate the technology’s financial return.
At scale, though, even data center-fueled enthusiasm still runs into a bottleneck: utilities own their physical wires, and they routinely push back against alternative funding models. In the Southwest Power Pool, for example, stakeholders proposed “sponsored upgrades,” in which the developer or grid user pays 100% of the cost of upgrades such as installing dynamic line ratings. Utilities in the market, Selker said, “were having none of it.”
There’s also a growing affordability sell for these technologies. Turner pointed out that high electricity prices in places like California are increasingly driven by transmission, distribution, and disaster remediation, i.e. the very categories where advanced grid technologies could make a dent. On that front, political pressure is building: last week, Pennsylvania Governor Josh Shapiro called the traditional utility business model “broken” in a pointed letter to the companies and pledged to actively oppose rate cases that do not prioritize cost-effective capital.
But until regulatory frameworks evolve to reward performance rather than raw capital spending, it’s likely that the commercial “liftoff” of the technologies will continue to run up against the institutional inertia of the current system.