The tech giant's data and climate lead Savannah Goodman says performance tracking and policy are up next.
Photo credit: Sebastian Gollnow / picture alliance via Getty Images
Nearly four years ago, Google first unveiled its carbon-intelligent computing platform. In the years since, it has enabled the company to match resource-intensive data center tasks to times when the grid can supply low-carbon power.
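The article doesn't detail how the carbon-intelligent platform actually schedules work, but the core idea of matching flexible compute to low-carbon hours can be sketched as a toy greedy scheduler. Everything below is an invented illustration, not Google's algorithm: the forecast values, capacity numbers, and task names are all hypothetical.

```python
# Toy sketch of carbon-intelligent load shifting (illustrative only;
# forecast values, capacity, and tasks are invented, not Google's data).

def schedule_flexible_tasks(carbon_forecast, tasks, capacity_per_hour):
    """Greedily place each deferrable task into the lowest-carbon hour
    that still has spare compute capacity.

    carbon_forecast: list of 24 hourly grid carbon intensities (gCO2/kWh)
    tasks: list of (name, compute_units) tuples
    capacity_per_hour: spare compute units available in each hour
    """
    # Hours sorted from cleanest to dirtiest.
    hours_by_carbon = sorted(range(len(carbon_forecast)),
                             key=lambda h: carbon_forecast[h])
    remaining = {h: capacity_per_hour for h in range(len(carbon_forecast))}
    plan = {}
    # Place the largest tasks first so they get the cleanest hours.
    for name, units in sorted(tasks, key=lambda t: -t[1]):
        for h in hours_by_carbon:
            if remaining[h] >= units:
                plan[name] = h
                remaining[h] -= units
                break
    return plan

# Example: a solar-heavy grid that is cleanest midday (hours 10-15).
forecast = [500] * 10 + [200, 150, 120, 130, 180, 250] + [450] * 8
tasks = [("youtube-transcode", 6), ("ml-batch-train", 8), ("index-rebuild", 4)]
print(schedule_flexible_tasks(forecast, tasks, capacity_per_hour=10))
```

In this toy version, the big training job lands in the cleanest hour (noon) and the remaining tasks fill in the next-cleanest hours with spare capacity, which is the basic shape of "match resource-intensive tasks to times when the grid can supply low-carbon power."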
At the same time, the U.S. market for distributed energy resources has nearly doubled, up to an estimated $45 billion per year; transmission bottlenecks have come into the spotlight; billions of federal dollars have poured into tech solutions; and, of course, there has been a truly staggering boom in artificial intelligence.
In October, Google announced its latest effort to mitigate data centers' strain on the grid: a demand response tool built on the same underlying infrastructure as its carbon-intelligent computing solution. Google data and climate science lead Savannah Goodman told Latitude Media that, while increased data availability has enabled the new tool, the move reflects the growing need for grid flexibility.
“We’ve been working with utilities to say, ‘We have these load flexibility capabilities; how can we co-optimize or manage them in a way so that we can also do what’s most useful for the grid?’” Goodman said.
The result of that collaboration is a series of pilot programs in which grid operator partners notify Google about a forecasted supply-constraining event (like extreme weather), and the relevant data centers receive instructions to limit non-urgent tasks for a specific window of time.
While the broader computing platform prioritizes decarbonization through daily load shifting in response to the grid’s carbon profile, Goodman added, this latest tool also prioritizes grid stability.
Very little new data or hardware was needed to set up these demand response pilots, Goodman said, because utility partners are primarily using already available data to decide when to send Google a signal to lower demand. Google's algorithms then generate an enhanced capacity curve that creates hour-by-hour use instructions, which can include pausing flexible workloads like YouTube video processing or rerouting certain tasks to data centers on a different power grid.
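The response step described here, capping a site's draw for a window, pausing deferrable work, and rerouting what can move to another grid, can be sketched roughly as follows. The workload fields, names, and thresholds are all hypothetical; Google has not published its actual scheduler logic.

```python
# Rough sketch of applying a demand-response window to workloads.
# All names, fields, and numbers are hypothetical illustrations.

def apply_dr_window(workloads, power_cap_mw):
    """Shed load until the site fits under power_cap_mw.

    workloads: list of dicts with keys:
      name, power_mw, urgent (must keep running here and now),
      reroutable (can move to a data center on another power grid)
    Returns (kept, paused, rerouted) lists of workload names.
    """
    kept, paused, rerouted = [], [], []
    load = sum(w["power_mw"] for w in workloads)
    # Shed the largest non-urgent workloads first.
    for w in sorted(workloads, key=lambda w: -w["power_mw"]):
        if load <= power_cap_mw or w["urgent"]:
            kept.append(w["name"])
            continue
        (rerouted if w["reroutable"] else paused).append(w["name"])
        load -= w["power_mw"]
    return kept, paused, rerouted

workloads = [
    {"name": "search-serving",  "power_mw": 8, "urgent": True,  "reroutable": False},
    {"name": "yt-processing",   "power_mw": 5, "urgent": False, "reroutable": False},
    {"name": "batch-analytics", "power_mw": 4, "urgent": False, "reroutable": True},
]
print(apply_dr_window(workloads, power_cap_mw=10))
```

Under a 10 MW cap, the urgent serving workload stays put, the video-processing batch is paused, and the analytics job moves to a data center on a different grid, mirroring the pause-or-reroute choice the article describes.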
The data behind the alerts themselves is all being run on the utility side of this partnership, Goodman said, but utilities can share key data points to help Google optimize its tool.
“In order to prepare for the pilot, they can share the number of demand response events they’ve seen in a typical summer, how long they typically last, what hours they’re typically at,” she said. “That can help us run our own simulation models internally to see, ‘Okay, how much capacity do we think we’ll be able to respond with to this event? How many events do we think we can handle per season?’”
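The kind of internal simulation Goodman describes could, in very simplified form, look something like the Monte Carlo sketch below: given a utility's typical event count, duration, and timing, estimate how many events per season the site could usefully respond to. Every number and parameter here is invented for illustration.

```python
import random

# Simplified Monte Carlo sketch of a pre-pilot simulation: given a
# utility's historical event statistics, estimate how many events per
# season a site could respond to. All figures are invented.

def simulate_season(events_per_summer, typical_duration_h, event_hours,
                    flexible_mw_by_hour, min_useful_mw,
                    trials=1000, seed=42):
    rng = random.Random(seed)
    handled_counts = []
    for _ in range(trials):
        handled = 0
        for _ in range(events_per_summer):
            start = rng.choice(event_hours)
            duration = max(1, round(rng.gauss(typical_duration_h, 1)))
            window = range(start, start + duration)
            # Count the event as handled if the site can shed a useful
            # amount of flexible load in every hour of the window.
            if all(flexible_mw_by_hour.get(h % 24, 0) >= min_useful_mw
                   for h in window):
                handled += 1
        handled_counts.append(handled)
    return sum(handled_counts) / trials

# Hypothetical inputs: ~10 summer events of ~3 hours starting 2-7 p.m.;
# 5 MW of flexible load in the afternoon, 2 MW in the evening.
flex = {h: 5 for h in range(12, 18)} | {h: 2 for h in range(18, 23)}
avg = simulate_season(10, 3, list(range(14, 20)), flex, min_useful_mw=3)
print(f"avg events handled per season: {avg:.1f}")
```

The output answers exactly the questions in the quote: how much capacity the site can respond with, and how many events per season it can expect to handle, before any real event is ever called.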
Utilities’ increased ability to predict events on the grid is key to enabling the pilots, Goodman said. But Google itself also has more data than ever about the shape of data center workloads, and has gotten more comfortable with the levels of flexibility it can introduce to those workloads.
Google’s decarbonization goals are aggressive, and the company has just six years to meet its net-zero target date of 2030. In 2023, the company reported that its Scope 2 emissions — which come mainly from data centers — accounted for 24% of its global carbon footprint. Access to renewable energy on the grid is, of course, essential to the company’s decarbonization plans; and at the moment, there isn’t enough.
Grid flexibility also plays a key role in enabling Google’s ambitious targets: “By providing demand flexibility, that can help increase the amount of variable renewables that are able to exist and be managed on the grid,” Goodman said, adding that Google has so far had “really good” results when it comes to timely load shedding.
“When we get a notification, we’ve seen very consistently that we are actually responding to those needs on the grid,” she said. She declined to share specific metrics for that success, but added that every pilot region is different in terms of how much capacity Google can share.
The pilots — including in Oregon, Nebraska, Taiwan, Ireland, and Denmark, to name a few — have so far focused on whether and to what extent the data centers can respond. In the future, Goodman will be looking to predict response performance, and gather more intel about a response’s overall impact on the grid.
While commercial demand response is typically associated with factories and industrial production, and is still a fairly new concept for data centers, Goodman said Google is starting to have conversations with other large cloud providers about grid flexibility.
“It’s important to see other kinds of large energy consumers participate,” she said. “That’s something we want to work on on the policy and advocacy side, is how we can work together to create new policy frameworks that better incentivize this type of participation from large scale energy buyers.”