
The AI ‘wild card’ promises to complicate clean clouds

The unpredictability of energy demand is throwing a wrench in the Big Three’s ambitious clean energy plans.

A data center's infrastructure, rendered in blues and reds.

AI-generated image credit: Gold Flamingo


Since 2019, the Big Three cloud providers — Microsoft, Google, and Amazon — have each set goals to meet 100% of their energy use with clean power.

But 2023 threw a wrench in those plans. While their targets haven’t changed, the world around them has: explosive demand for artificial intelligence is complicating once-predictable energy use.

This is because many AI applications are trained and operated on the cloud. Data centers, where the cloud runs on rows of servers, already account for 1% to 1.5% of global greenhouse gas emissions. While their electricity use has stayed relatively flat over the past decade, thanks to impressive gains in energy efficiency, that may change as demand becomes harder to predict.

AI requires far more computing power than data centers have typically supplied. Training a single model can consume an estimated 1.3 gigawatt-hours, and a new study projects that by 2027, worldwide AI energy demand may match the current footprint of Sweden. The growth of AI is all but certain to expand both the number of data centers and their power use. And for the Big Three, that means AI is complicating their plans to procure renewable energy.
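
To put 1.3 gigawatt-hours in perspective, a back-of-the-envelope comparison against household electricity use helps. The household figure below is an assumed ballpark (roughly the US residential average), not a number from the article:

```python
# Rough scale check: one 1.3 GWh training run vs. average annual
# US household electricity use (~10.5 MWh/year, an assumed ballpark).

training_run_gwh = 1.3
household_mwh_per_year = 10.5  # assumption, not from the article

homes = training_run_gwh * 1000 / household_mwh_per_year
print(f"One training run ~= the annual electricity of {homes:.0f} homes")
```

Under those assumptions, a single training run consumes about as much electricity as 124 homes use in a year.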

The avalanche of AI

This year, fascination with AI hit the cloud industry like a wave. Between 2022 and 2023, references to AI during first-quarter corporate earnings calls leapt by 77%. The count rose even higher by the summer.

The avalanche of interest caught many by surprise. With customers and investors suddenly eager to explore AI, cloud providers must now build, retrofit, and redesign their infrastructure, even as the long-term outlook for AI demand remains, well, cloudy. For the Big Three and other cloud providers committed to lowering their emissions, this push comes with the added pressure of ensuring their clean energy procurement plans account for the surge.


“I don't think anyone fully anticipated how quickly AI was going to inject itself into the collective consciousness,” said Brian Janous, former vice president of energy at Microsoft, who stepped down in August after 12 years with the company. “There's certainly been a moment for everyone over the last year, everyone who's close to this issue, where they've been like, ‘Oh sh*t, this is for real. This is real demand, real customer growth, real applications that are being built on this,’” he said.

For Janous, that moment came in March, with the release of GPT-4, the latest model behind ChatGPT. The new model could run circles around its predecessor, scoring in the 88th percentile on the law school admissions exam, up from the 40th.

“It was the moment I realized that the iterations in terms of speed and quality of development that we were seeing in the AI space [were] happening so much faster than the market was recognizing the need to build the infrastructure to support it,” Janous said. “That was the moment that I realized we've got a serious challenge ahead of us.”

New 24/7 clean power goals expose data centers to variability in wind and solar, forcing them to build additional capacity. (AI-generated image credit: Gold Flamingo)

Precipitating clean clouds

Over the past decade, the tech industry has been a key driving force for clean energy. Tech companies collectively bought more than 36.7 gigawatts of power purchase agreements for wind and solar, amounting to nearly half of all corporate procurement between 2014 and 2022. These efforts — led by the Big Three, together with Meta — bridged the gap between the once-emerging renewable energy industry and the corporate push to take action on climate change.

There’s a distinction, however, between meeting 100% clean energy on paper and meeting 100% clean energy around the clock. To say they’ve reached 100%, some companies will simply purchase enough PPAs to match their annual energy use, even if power from those projects doesn’t directly flow to their facilities. Accordingly, Amazon’s current goal to match its global energy use with PPAs will be relatively flexible in the age of AI. 

Google and Microsoft, however, are aiming for a much harder goal: 24/7 clean power by 2030. The new targets will require covering any variability in wind and solar with storage or alternative sources, like geothermal energy. The real challenge of this goal is ensuring that clean energy infrastructure — and the flow of electrons — physically connects to all operations, including new hyperscale data centers. It exposes data centers to the variability of wind and solar, forcing them to build additional capacity. One estimate by the Uptime Institute found that to meet 100% carbon-free energy from wind alone, data center operators must install three to five times the capacity they actually need.
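
The Uptime Institute's three-to-five-times figure can be sanity-checked with simple capacity-factor arithmetic. The load size and capacity factors below are illustrative assumptions, not the institute's actual inputs:

```python
# Back-of-the-envelope check on the wind overbuild estimate.
# All numbers here are illustrative assumptions.

def required_wind_capacity_mw(load_mw: float, capacity_factor: float) -> float:
    """Installed wind capacity needed to generate, on average,
    as much energy as a constant data center load consumes."""
    return load_mw / capacity_factor

load = 100.0  # hypothetical 100 MW data center
for cf in (0.35, 0.25):  # assumed onshore wind capacity factors
    needed = required_wind_capacity_mw(load, cf)
    print(f"capacity factor {cf:.0%}: install {needed:.0f} MW "
          f"({needed / load:.1f}x the load)")
```

A 25% capacity factor implies 4x the load and 35% implies roughly 2.9x, broadly consistent with the three-to-five-times range once storage losses and curtailment are layered on.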

Companies also need to work directly with utilities, grid operators, and the government to build transmission lines to their data centers. Interconnection remains a critical challenge in the U.S., as evidenced by the two terawatts of renewable energy capacity awaiting eventual connection to the grid. Unlike the renewable energy buildout, which tech companies could simply finance, this next stage requires working with industries that operate at a slower pace.

“There’s a crunch for energy everywhere,” said Neha Palmer, co-founder and CEO of Terawatt, who helped electrify Google’s data center operations until 2021. “It is incumbent upon utilities and grid operators to start moving faster.” 

Meanwhile, data centers face their own supply chain constraints. And with growing awareness about the energy and water they consume, new sites are increasingly hard to find.


Now, the pressure to prepare for AI is reverberating well beyond the major cloud providers, said Arman Shehabi, staff scientist at Lawrence Berkeley National Lab, who studies data center power use. Operators are ramping up existing efforts to upgrade cooling systems and optimize capacity use. Given the high concentration of data centers in strategic areas, such as northern Virginia, the impacts of AI won’t be felt evenly across the country. 

“Really we're talking about a fraction of a percent increase overall,” Shehabi said. “But it could be doubling or tripling in one particular region.”

Flexible workflows and the AI wild card

Exactly how much AI energy demand will grow — and where — are the key questions looming over the cloud industry. Early reports suggest that AI has already had a hand in rising electricity bills, but it’s still too soon to say how many customers will adopt the technology and how they will use it. Ben Hertz-Shargel, global head of grid edge at Wood Mackenzie, points to several key variables, including the demand for AI training and applications of ‘green code,’ which developers use to reduce the carbon footprint of software.

AI computing may be thought of in three stages: data pre-processing, training, and use, Hertz-Shargel said. The first two stages can take place anywhere in the world, while the use stage must run at a data center near the end user. Accordingly, cloud providers looking to decarbonize could shift the energy-intensive training stage to regions with abundant renewable energy, or at least avoid training during times of peak energy demand. This solution requires some buy-in from cloud customers, who are starting to use green code to coordinate workloads with clean electricity production.
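
The carbon-aware placement idea described above can be sketched as a one-line scheduling rule: send a flexible training job to whichever region's grid is cleanest right now. The region names and intensity values below are hypothetical placeholders:

```python
# Minimal sketch of carbon-aware placement for a flexible training job.
# Region names and carbon-intensity values are hypothetical.

from typing import Dict

def pick_training_region(carbon_intensity: Dict[str, float]) -> str:
    """Return the region with the lowest grid carbon intensity (gCO2/kWh)."""
    return min(carbon_intensity, key=carbon_intensity.get)

regions = {
    "us-east": 410.0,   # assumed gCO2/kWh at scheduling time
    "eu-north": 45.0,
    "asia-se": 530.0,
}
print(pick_training_region(regions))  # prints: eu-north
```

A real scheduler would also weigh data-residency rules, GPU availability, and transfer costs; this shows only the carbon-minimizing core of the idea.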

“AI presents a significant opportunity, because the lion's share of computing within AI is a flexible type of workflow,” Hertz-Shargel said. “It can't all be done and it shouldn't all be done by data center operators requiring 100% clean energy — I think that's too much of a challenge in the near future.”

No matter how green the code, however, training is still one of the most energy-intensive parts of AI development. The technology processes millions or even billions of data points to improve its accuracy, so if many companies want to train their own AI models on private datasets, overall energy demand will almost certainly rise sharply.

According to data from Wood Mackenzie, the energy consumed per training run has increased exponentially since 2018 — by an average of 7.8 times per year — and it's likely to keep growing as more companies adopt AI. Whether this will be mitigated by better efficiency of hardware, software, and operations is hard to say, but Hertz-Shargel believes operating a large cloud operation in the future won't be as simple as ramping up renewable procurement. Indeed, some data centers may be seeking relief in nuclear energy.
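
Compounding that 7.8x-per-year figure shows how fast per-run energy use explodes. The 2018 baseline below is a placeholder for illustration, not a Wood Mackenzie number:

```python
# Compounding the 7.8x/year growth figure cited above.
# The 2018 baseline is a made-up placeholder, not real data.

baseline = 1.0  # hypothetical energy per training run in 2018 (relative units)
growth = 7.8    # average year-over-year multiplier from the article

for year in range(2018, 2024):
    multiple = baseline * growth ** (year - 2018)
    print(f"{year}: {multiple:,.0f}x the 2018 baseline")
```

Five years of 7.8x compounding implies a per-run figure nearly 29,000 times the 2018 baseline, which is why efficiency gains in hardware and software matter so much.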

In statements to Latitude Media, spokespersons for Amazon and Microsoft underscored their companies’ renewable energy achievements thus far, with Amazon highlighting its strategy to scale on- and off-site renewables.

“Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” a spokesperson wrote via email. Google declined to comment. 

While many are hesitant to sound the alarm on AI energy use, it’s clear that companies are watching carefully. “For years, I've talked about EV adoption and industrial electrification, and no one was talking about AI,” Janous said. “But suddenly you inject this thing like AI into the mix, and you took what was already a big challenge for utilities, and then you just see this wild card.”
