
Microsoft’s vision for utility AI integration

“I don't worry about the tech being good enough,” said Microsoft’s Hanna Grene. “I worry about how we change the way we’re working and approaching these problems.”

A color-treated image of Hanna Grene on a purple and red background

Image credit: Anne Bailey / Microsoft

In his annual letter to investors this month, Microsoft CEO Satya Nadella laid out his vision for a world reshaped by artificial intelligence. 

“We believe we have now arrived at the next big step forward — natural language — and will quickly go beyond, to see, hear, interpret, and make sense of our intent and the world around us,” he wrote, referring to systems built on large language models, like ChatGPT.

Microsoft was an early mover in integrating OpenAI’s LLM into its Azure cloud services. And now every part of Microsoft’s technology stack — from cloud infrastructure to data analytics to consumer apps — will be “reimagined” for the AI era, said Nadella. As a result, every industry will inevitably be impacted by AI. 

Utilities will also find themselves at the center of this shift, even if most aren’t yet actively investing in AI for grid management. Generative AI will increasingly start to influence back-office operations and customer support inside utilities for “focus and efficiency,” explained Microsoft’s Hanna Grene, on stage at Latitude Media’s Transition-AI: New York conference.

“For me, AI is an ‘unlock’ in how we work on problems. And for folks who might not have seen some of the power of digital technology, if their kids are showing them on their phones what AI is capable of, it's an ‘aha’ moment for people,” said Grene.

As global operations and go-to-market leader of energy and resources at Microsoft, Grene collaborates with utilities and other large energy companies on using cloud services (and increasingly artificial intelligence) in service of decarbonization. 

These applications include methane detection, data analysis, and distributed energy management systems. But the most common uses inside utilities in the near-term, said Grene, will likely revolve around improving internal processes.

Grene sat down with Latitude Media co-founder and executive editor Stephen Lacey to share her thoughts on which utilities are best poised to take advantage of AI applications.

But, as she told Lacey, accelerating AI integration isn’t just about improving the technology — it’s also about people and process.

Their conversation has been edited for brevity and clarity.


Stephen Lacey: Microsoft has identified hundreds of potential applications for AI and machine learning. What rises to the top?

Hanna Grene: There are four areas where Azure OpenAI is really finely tuned: summarization, semantic search, code generation, and content generation. 

We’re also pulling those generative AI capabilities into a number of our products, and we’re calling those Copilots. The areas where I’m seeing customers apply those are in predictive analytics, forecasting, health and safety, supply chain, and customer service.

In this industry in particular, we have so much intelligence scattered throughout our organizations, and there's a lot of time spent hunting for the right information to do your job. With some of these datasets, you can now use that semantic search capability: your teams can search, they can summarize, they can find the data faster, and they can go back to the job that they set out to do in the first place. That, for me, drives focus and efficiency.

Also, utilities have more data than all of our peer industries. We have more data than healthcare, than finance; they would be jealous of our datasets. That to me is the second application to unlock — we can get serious about our systems, about our networks, about our business. And we can ask questions of our data without a Python coder having to sit next to a regulatory attorney.

SL: Tell us about the applications of AI for methane detection — what tools are you using?

HG: We've partnered really closely with Accenture on this, and they've developed a methane emissions platform and toolset. They’re working on it for oil and gas and for the power and utility space. We've been using AI to help determine where to place the sensors so that you find the leaks as quickly as possible and measure leak amounts. This means you can ensure remediation, and you can do a quality control check that the remediation was successful. 

In any of these cases, we're looking for and currently trialing applications of AI to learn from the patterns of where we're seeing leaks in the systems. And then there’s the question of how we get from what the platform can do today — detect, remediate, measure, resolve — to the next phase, which is detect before a leak even happens. That's really the project.

SL: You have a background in distributed energy resource management systems, and you’re partnering with Schneider Electric on a new DERMS system run in Azure. Tell me about what you're building there and where AI plays a role.

HG: In July, we announced a DERMS system with Schneider Electric, built and deployed fully in the cloud and available on Azure. Back in May of 2022, we came together with Pacific Gas and Electric and Schneider Electric, and thought: what would it take to deploy a DERMS system that was scalable, secure, and faster to meet PG&E’s very pressing and current needs on their grid?

We brought in experts from around the world, from Schneider Electric and from Microsoft, to sit with the PG&E team, and we asked hard questions. And by the end of the year, December 2022, we were ready to start phase one of deployment. I have never seen teams move that hard and that fast at breakthrough innovation for a system of that scale and ambition. So this DERMS system will help balance loads to support growing electrification. 

There are also some really exciting AI use cases in DERMS. Your DERMS system can actually become a giant dataset, and you could start to think about querying it using that semantic search. You could ask your DERMS system, “what's the average load on that feeder in March?” And then you could say, “okay, how much of that is from EVs?” That is exciting. 
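To make that concrete, here is a hypothetical sketch of what those two questions might bottom out in once a natural-language layer translates them into code. Everything here — the record fields, the feeder names, and the figures — is invented for illustration; this is not Azure OpenAI's actual pipeline, just the kind of aggregation the questions reduce to:

```python
from statistics import mean

# Toy stand-in for a DERMS telemetry export (all values invented).
telemetry = [
    {"feeder": "F-12", "month": "March", "load_kw": 410.0, "ev_load_kw": 82.0},
    {"feeder": "F-12", "month": "March", "load_kw": 390.0, "ev_load_kw": 78.0},
    {"feeder": "F-12", "month": "April", "load_kw": 505.0, "ev_load_kw": 130.0},
    {"feeder": "F-07", "month": "March", "load_kw": 220.0, "ev_load_kw": 11.0},
]

def average_feeder_load(rows, feeder, month):
    """'What's the average load on that feeder in March?' — the kind of
    aggregation a semantic-search layer might turn the question into."""
    matches = [r["load_kw"] for r in rows
               if r["feeder"] == feeder and r["month"] == month]
    return mean(matches)

def ev_share(rows, feeder, month):
    """Follow-up question: 'How much of that is from EVs?' (as a fraction)."""
    matches = [r for r in rows
               if r["feeder"] == feeder and r["month"] == month]
    return sum(r["ev_load_kw"] for r in matches) / sum(r["load_kw"] for r in matches)

print(average_feeder_load(telemetry, "F-12", "March"))  # 400.0
print(ev_share(telemetry, "F-12", "March"))
```

The point of the natural-language layer is that the analyst never writes these functions: the question is the interface, and the code is generated and executed behind it.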

And then another area where we're going to see breakthrough opportunities with AI and DERMS is forecasting. Forecasting is a huge opportunity for AI in general. Assume 40% of your customer base has EVs with level-two charging. Assume 20% of your generation comes from DERs. That is a level and complexity of forecasting that we do not have today. And on the distribution grid, AI is going to be a really powerful tool to unlock that. 
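As a toy illustration of why that forecasting problem gets hard, a naive deterministic forecast might decompose expected feeder demand into a base load plus an EV-charging term, minus distributed generation available at peak. All parameters below (charger rating, coincidence factor, loads) are invented assumptions, not a real utility model — the point is that even this crude version has parameters AI would need to learn rather than assume:

```python
def forecast_peak_kw(base_kw, customers, ev_share, der_kw):
    """Naive peak-load forecast for a feeder: base load, plus level-two
    EV charging (assumed 7.2 kW per charger, 30% coincident at peak),
    minus distributed generation assumed available at peak."""
    EV_CHARGER_KW = 7.2   # typical level-two charger rating (assumption)
    COINCIDENCE = 0.30    # fraction of EVs charging at system peak (assumption)
    ev_kw = customers * ev_share * EV_CHARGER_KW * COINCIDENCE
    return base_kw + ev_kw - der_kw

# The scenario from the conversation: 40% EV adoption across 1,000
# customers, with 200 kW of DER output at peak on a 2 MW base load.
print(forecast_peak_kw(base_kw=2_000.0, customers=1_000, ev_share=0.40, der_kw=200.0))
```

In practice the coincidence factor and DER output are not constants — they vary with weather, tariffs, and behavior — which is exactly the part a learned forecasting model would replace.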

SL: We can’t have this conversation without talking about the energy intensity of AI and particularly large language models. How are you managing that? How does it make your decarbonization goals more complicated?

HG: As we go forward into this next stage of AI, we are not backing off of our climate goals. By 2025, we are committed to meeting our full load with 100% carbon-free electricity, and that is a robust goal already. But then when we look out to 2030, we are committed to being carbon-free as a company, to removing diesel gensets as a backup power source to our data centers — which, by the way, means we need a really, really, really reliable grid, globally, and so we’ve got to work with you on that. We're a large load. And we need a lot of infrastructure growth to help support our growing needs in the space.

SL: Who is best prepared to take advantage of new emerging technologies? I assume there are utilities who’ve been working on this for a while.

HG: Yeah, and what they’ve been working on is the data. AI is only as good as your data. That’s just the foundational truth of all of this work. So the companies that have been on that journey for a while have these big, pure pools of data; they have well-organized data estates; they’re bringing in data from their business processes, and from their grid, and from their generation. Those are the companies that are already harvesting those really rich insights into the future of their business.

The more we fill that pool with the right data — and the more we leverage AI to make those datasets clean — that’s where the real outcomes are. The equation I always come back to is: business insights are a function of [computing power] and close proximity to data. That’s where you’re going to get the results that you’re really looking for.

SL: What’s your vision for where things are going in terms of integrating AI through the end of the decade? Will we still be in the experimentation phase?

HG: We don't have time to still be in the experimentation phase by 2030. And so, for me, the excitement around this moment is an opportunity for those of us invested in accelerating the energy transition. Let's harness this excitement. Let's start changing the way that our teams work, the way that we collaborate. Let's partner with your controls partners, your metering providers, your technology partners. Let’s bring the best minds together to start approaching these problems differently, to focus on scale and repeatability and how we’re delivering these essential systems faster.

At the end of the day, that’s really what’s going to make this happen: people and teams and talent and partnerships. I’m already seeing breakthrough applications in this sector, so I don't worry about the tech being good enough. I worry about how we change the way we’re working and approaching these problems: how do we partner more effectively?
