Many major utilities have spent the last several years in an artificial intelligence exploration phase, developing their own grid planning and data analytics tools and experimenting with pilots and proofs of concept.
But now, the sheer scale of demand growth combined with extreme weather is serving as a “forcing function” for their adoption of AI and data-intensive cloud services, said Howard Gefen, general manager for energy and utilities at Amazon Web Services.
As the cloud provider behind much of the energy sector’s software, AWS has had a front-row seat to how electric utilities, and their counterparts in the oil and gas sector, are adopting AI. The company has long pushed the power sector to lean into the “fail fast” ethos of the tech world, even as the sector grapples with slow adoption of these advanced technologies and with data silos across legacy systems.
In the last year and a half, a handful of utility use cases for AI have moved from proof of concept into production, spearheaded by a few early adopters, Gefen said, pointing to Duke Energy, which is deploying a suite of intelligent grid services across the core of its business.
Now, the rest of the industry is in an in-between phase, watching first movers share the results of their models and coalesce around some best practices — and ideally smooth the way for others to follow.
“The culture of failing fast and experimenting and going through what we call two-way doors — where you can make a decision and then iterate on it — takes a little longer to take hold in [utilities],” Gefen said. However, he added, when AI can prove itself on a time-intensive manual task, like performing a grid analysis in 15 minutes that incorporates all of the company’s enterprise data, it changes how the utility thinks about its business. “And that is happening.”
Early use cases
For now, adoption is still concentrated among a few key use cases, according to Gefen, including grid optimization and analysis. Duke Energy, for example, has been working with AWS for more than three years to automate some of its grid management work.
On the interconnection front, Duke and AWS developed an automation and validation tool, deploying specialized agents that act as virtual engineers: orchestrating parallel simulations, validating results against regulatory standards, and even integrating data from Duke’s AI asset-inspection partnerships. That rollout has cut the complete process, including study preparation, from months to hours.
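Gefen didn’t share implementation details, but the pattern he describes (fanning study cases out to parallel workers, then screening each solved case against a planning standard) can be sketched roughly in Python. Every name below, from the placeholder power-flow solver to the thermal limit, is an invented assumption rather than Duke’s or AWS’s actual tooling.

```python
# Illustrative sketch only: run interconnection study cases in parallel,
# then screen each result against a planning criterion. All names here
# are hypothetical; the real Duke/AWS system is not public.
from concurrent.futures import ProcessPoolExecutor
from dataclasses import dataclass

@dataclass
class StudyResult:
    case_id: str
    worst_thermal_loading: float  # percent of line rating

def run_power_flow(case_id: str) -> StudyResult:
    """Stand-in for a real power-flow solver run (e.g., a pandapower or PSS/E case)."""
    # A real implementation would build and solve the network case here.
    return StudyResult(case_id, worst_thermal_loading=87.5)

def validate(result: StudyResult, thermal_limit: float = 100.0) -> bool:
    """Screen a solved case against a (hypothetical) regulatory limit."""
    return result.worst_thermal_loading <= thermal_limit

if __name__ == "__main__":
    cases = [f"gen-addition-{i}" for i in range(200)]
    with ProcessPoolExecutor() as pool:  # the "parallel simulations" step
        results = list(pool.map(run_power_flow, cases))
    flagged = [r.case_id for r in results if not validate(r)]
    print(f"{len(flagged)} of {len(cases)} cases need mitigation")
```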
The utility has done the same for its distribution network, migrating dozens of planning applications to the cloud to conduct more accurate and granular electricity demand forecasting. One key application produces an hour-by-hour projection for every customer meter over the next 11 years, along with optimized solution calculations for new load requirements.
Running those simulations on-premises took up to six weeks of computing time, Gefen said. Moving them to the cloud removes the need to spend millions of dollars on new hardware and licenses, and helps the company identify alternatives to traditional infrastructure upgrades, like demand response and other operational changes that squeeze more out of existing assets.
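For a rough sense of the scale involved: an hourly projection over 11 years works out to 96,360 data points per meter, repeated across millions of meters. The naive seasonal model below is purely illustrative, an assumption standing in for Duke’s actual forecasting method, but it shows the shape of the per-meter workload.

```python
# Back-of-the-envelope sketch of the per-meter forecasting workload.
# The growth rate and seasonal model are invented for illustration.
import numpy as np

HOURS_PER_YEAR = 8_760
HORIZON_YEARS = 11  # 8,760 x 11 = 96,360 hourly points per meter

def forecast_meter(history: np.ndarray, growth_rate: float = 0.02) -> np.ndarray:
    """Naive seasonal forecast: repeat the last year's hourly shape,
    scaled by an assumed annual growth rate."""
    last_year = history[-HOURS_PER_YEAR:]
    years = [last_year * (1 + growth_rate) ** (y + 1) for y in range(HORIZON_YEARS)]
    return np.concatenate(years)

# One synthetic meter: two years of hourly history with a daily cycle.
rng = np.random.default_rng(0)
hours = np.arange(2 * HOURS_PER_YEAR)
history = 1.5 + 0.5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.1, hours.size)

projection = forecast_meter(history)
print(projection.shape)  # (96360,) -- a real system repeats this per meter
```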
AWS is presenting that work to Duke’s peers as both a solution to the looming speed-to-power problem and a reference project for others to copy, Gefen said, but the catch-up period will take some time.
Parallel moves
Utilities aren’t known for being fast-moving. But in the oil and gas industry, which has a longer history with advanced computing, adoption and rollout are moving at a rapid clip, Gefen said.
That’s largely because the sector is more concentrated, with around 20 major players, compared to the thousands of utilities in the U.S. alone. Those companies also aren’t regulated like utilities are, making them more competitive with each other, less risk-averse, and extremely focused on operational efficiency.
In that hypercompetitive landscape, even a one-dollar reduction in the cost of producing a barrel of oil is a massive win. And in the last few years, Gefen said, companies have started to recognize AI as a core profit driver, rather than an IT experiment.
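For a producer moving, say, a million barrels a day, that single dollar works out to roughly $365 million a year.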
BPX Energy, BP’s U.S. onshore oil and gas business focused on shale production in Texas and Louisiana, has saved an estimated 15,000 hours annually by automating well performance analysis. Part of that work is a tool that acts much like ChatGPT for engineers, letting them query pressure trends or well performance anomalies in plain language. The underlying AI then writes the computer code needed to fetch the data and returns the answer.
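The general shape of that kind of tool, a model that turns a plain-language question into query code and executes it against well data, can be sketched as follows. The model call is stubbed out and the schema is invented; BPX’s actual stack, data model, and prompts are not public.

```python
# Hedged sketch of a natural-language-to-code query tool. The LLM call is
# a stub returning a canned answer; a real system would call a model
# (e.g., via Amazon Bedrock) with the schema hint and the question.
import pandas as pd

SCHEMA_HINT = "DataFrame `wells` columns: well_id, timestamp, tubing_pressure_psi, oil_rate_bpd"

def generate_pandas_code(question: str) -> str:
    """Stand-in for the model call that writes query code for `question`."""
    # Canned response for this one example question.
    return "result = wells.groupby('well_id')['tubing_pressure_psi'].mean()"

# Tiny synthetic dataset in place of real well telemetry.
wells = pd.DataFrame({
    "well_id": ["A-1", "A-1", "B-2", "B-2"],
    "timestamp": pd.date_range("2024-01-01", periods=4, freq="h"),
    "tubing_pressure_psi": [2100.0, 2080.0, 1950.0, 1975.0],
    "oil_rate_bpd": [450.0, 445.0, 390.0, 398.0],
})

question = "What is the average tubing pressure per well?"
code = generate_pandas_code(question)
namespace = {"wells": wells}
exec(code, namespace)  # a production system would sandbox this step
print(namespace["result"])
```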
Gefen is confident utilities will soon reach the same conclusion that oil and gas has. Once they make the leap to cloud-based solutions for a few concrete problems, it starts to change how the whole organization operates, he added, enabling both faster decision-making and better efficiency. And in the near future, he expects that virtuous cycle to pull even the most cautious utilities along.
“In the next five years, everybody will move,” he said. “It’s always quicker than you think in the long term, and longer than you think in the short term.”