Analysis

Where does the AI boom leave data center cooling strategies?

Google is relying mostly on water cooling to keep the energy toll of its data centers down — but that water is increasingly scarce.

Published March 13, 2024

A sunset over Google data center water storage tanks and cooling towers. Photo credit: Google

As artificial intelligence’s data center capacity demands grow, so do concerns about an impending spike in energy demand. In the United States, data center construction is already rising to meet that demand, notably in markets like Salt Lake City, Las Vegas, and Phoenix, where heat and drought add a layer of complexity to an additional challenge: how to cool the computers.

  • The top line: Without a decarbonized grid, cooling the massive uptick in compute power by air could add energy demand to a data center’s already hefty tab, and increase emissions to boot. At the same time, liquid cooling consumes billions of metric tons of water every year, which is of particular concern for the water-scarce regions that also happen to be hotspots for new data center builds. Deciding between these options is among the data center owner’s central conundrums.
  • The market grounding: Generally, there has been limited data on water consumption by data centers — the metric simply hasn’t been as important to developers as the local cost of electricity. But as drought plagues certain regions of the country, that has begun to change, prompting tech giants including Google to begin sharing their water stats: in 2021, the average Google data center consumed around 450,000 gallons of water each day. That same year, around a fifth of U.S. data center servers sourced water directly from moderately to highly stressed watersheds.

Ben Townsend, global head of data center sustainability at Google, said the tech giant expects to be using water cooling (as long as the local watershed permits) for most data centers for the foreseeable future. 

That’s in large part because infrastructure for air-cooled data centers — like infrastructure for energy distribution more broadly — has to be built for peak days. That means that, for ten-odd days per year, many top data center locations require up to 50% more energy to air-cool their computers than they do on an average day. The associated additional transmission, distribution, and generation capacity could place data center projects at the mercy of construction timelines and interconnection queues.
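
To put rough numbers on that sizing problem, here is an illustrative back-of-envelope calculation in Python. The ten-odd peak days and the up-to-50% figure are the ones cited above; the 200 MWh/day baseline cooling load is a hypothetical placeholder, not a Google figure.

```python
# Back-of-envelope: peak days add little to annual cooling energy, but they set the
# size of the grid infrastructure. The baseline load is a made-up illustrative number.

AVG_DAILY_COOLING_MWH = 200   # hypothetical average daily cooling energy for one site
PEAK_MULTIPLIER = 1.5         # "up to 50% more energy" on the hottest days
PEAK_DAYS_PER_YEAR = 10       # "ten-odd days per year"

normal_days = 365 - PEAK_DAYS_PER_YEAR
annual_mwh = (normal_days * AVG_DAILY_COOLING_MWH
              + PEAK_DAYS_PER_YEAR * AVG_DAILY_COOLING_MWH * PEAK_MULTIPLIER)
extra_from_peaks = PEAK_DAYS_PER_YEAR * AVG_DAILY_COOLING_MWH * (PEAK_MULTIPLIER - 1)

print(f"Annual air-cooling energy: {annual_mwh:,.0f} MWh")
print(f"Added by peak days: {extra_from_peaks:,.0f} MWh ({extra_from_peaks / annual_mwh:.1%})")
# Yet the transmission, distribution, and generation serving the site must be sized
# for the peak-day draw, i.e. roughly 1.5x the average cooling load.
```

In this toy example the peak days contribute only around 1% of annual cooling energy, but they dictate how much grid capacity has to be built, which is exactly why they expose projects to construction timelines and interconnection queues.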

And time is of the essence. AI requires more computing capacity, faster, meaning there’s huge demand for hardware that can run more calculations while taking up less space: in other words, hardware that produces more heat. Some estimates suggest that new iterations of graphics processing and central processing units — optimized for workloads like natural language processing and deep learning, for example — can emit more than five times as much heat as their predecessors of just a few years ago.

The water option

Water cooling may be better equipped to handle those high temperatures, and on average uses around 10% less energy than air cooling (with the exception of peak days, of course). 
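
Continuing the same toy example (and the same hypothetical 200 MWh/day baseline as the sketch above), that roughly 10% average saving compounds into a meaningful annual gap between the two approaches; treat the arithmetic below as illustrative, not reported Google data.

```python
# Illustrative annual comparison using the article's figures: water cooling uses
# ~10% less energy on an average day, while air cooling needs up to 50% more on
# roughly 10 peak days per year. The daily baseline is a hypothetical number.

AVG_DAILY_AIR_MWH = 200
PEAK_DAYS, PEAK_MULTIPLIER = 10, 1.5
WATER_SAVING = 0.10

air_annual = ((365 - PEAK_DAYS) * AVG_DAILY_AIR_MWH
              + PEAK_DAYS * AVG_DAILY_AIR_MWH * PEAK_MULTIPLIER)
water_annual = 365 * AVG_DAILY_AIR_MWH * (1 - WATER_SAVING)  # assumes no peak penalty

print(f"Air cooling:   {air_annual:,.0f} MWh/yr")
print(f"Water cooling: {water_annual:,.0f} MWh/yr")
print(f"Gap:           {air_annual - water_annual:,.0f} MWh/yr")
```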

But while Google’s energy efficiency prioritization generally means it opts for water cooling, the company’s “climate-conscious approach” to cooling requires assessing the watersheds of potential data center sites against its water risk framework, Townsend said.

As part of that framework, Google compares a data center’s projected water demand to the available supply, taking into account the projected demands of local communities. The company’s watershed health experts determine the level of risk of depletion or scarcity of a particular location. Then they conduct a feasibility study of potential water treatment and delivery options to and from a data center, before determining whether water cooling is an appropriate option.
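
The framework’s internals aren’t public, but the screening logic Townsend describes maps naturally onto a simple decision flow. The sketch below is not Google’s actual water risk framework; the fields, thresholds, and risk labels are all illustrative assumptions.

```python
# Minimal sketch of the kind of site screening described above (NOT Google's
# actual water risk framework; every field and threshold here is an assumption).

from dataclasses import dataclass

@dataclass
class SiteAssessment:
    projected_demand_mgd: float        # data center's projected demand, million gallons/day
    community_demand_mgd: float        # projected demand of local communities
    available_supply_mgd: float        # sustainable supply of the watershed
    watershed_risk: str                # "low" / "moderate" / "high", per watershed-health review
    treatment_delivery_feasible: bool  # outcome of the treatment-and-delivery feasibility study

def recommend_cooling(site: SiteAssessment) -> str:
    """Recommend water cooling only if the local watershed can support it."""
    total_demand = site.projected_demand_mgd + site.community_demand_mgd
    if site.watershed_risk == "high":
        return "air cooling"            # e.g. the Mesa, Arizona decision discussed below
    if total_demand > site.available_supply_mgd:
        return "air cooling"            # demand would outstrip sustainable supply
    if not site.treatment_delivery_feasible:
        return "air cooling"            # no workable water treatment/delivery path
    return "water cooling"

print(recommend_cooling(SiteAssessment(2.0, 40.0, 60.0, "high", True)))  # air cooling
print(recommend_cooling(SiteAssessment(2.0, 30.0, 60.0, "low", True)))   # water cooling
```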

“The challenge that we’re facing [in how to cool data centers] is super local,” Townsend told Latitude Media, pointing to geographically dependent elements like carbon intensity, renewables availability, watershed health, and development lead times. 

“It’s a multifaceted challenge that has to be optimized based on every single site we evaluate,” he added.

Though the water risk framework is still in the early stages of application, Townsend pointed to Google’s recently announced data center campus in Mesa, Arizona, as an example of how the evaluation process will work. The center will help power services like Google Search and Gmail, and will use air cooling due to water scarcity in the Colorado River basin.

“We did not feel that that watershed was in healthy shape,” Townsend said. “We opted to not use water resources for data center cooling at that location, and of course that is going to end up with a data center that consumes more energy, which we will have to work really hard to solve for at a grid level in that region.”

Google's comparison of the process of air cooling vs. water cooling a data center. (Image credit: Google)

Preventing runaway power consumption

Water cooling may have lower direct consumption of electricity — and companies like Google are increasingly experimenting with alternatives to freshwater, like wastewater or industrial water — but treating that water also takes electricity. The indirect water and carbon footprints of data centers include consumption and greenhouse gas emissions not only from the operations themselves, but also from the water sources that supply them and from the treatment plants that deal with the wastewater.
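
A simple accounting sketch shows how that wider boundary changes the picture; every number below, including the grid carbon intensity, is a hypothetical placeholder rather than a reported figure.

```python
# Illustrative footprint accounting: a water-cooled site's total includes electricity
# used to source and treat the water, not just what the data center consumes on-site.
# Every figure here is a made-up placeholder.

ONSITE_COOLING_MWH = 65_000        # hypothetical annual on-site cooling electricity
WATER_SUPPLY_MWH = 1_200           # pumping/conveyance electricity at the water source
WASTEWATER_TREATMENT_MWH = 1_800   # electricity at the plant treating the discharge
GRID_TCO2E_PER_MWH = 0.35          # hypothetical grid carbon intensity

total_mwh = ONSITE_COOLING_MWH + WATER_SUPPLY_MWH + WASTEWATER_TREATMENT_MWH
indirect_share = (WATER_SUPPLY_MWH + WASTEWATER_TREATMENT_MWH) / total_mwh

print(f"Total cooling-related electricity: {total_mwh:,} MWh/yr")
print(f"Indirect (supply + treatment) share: {indirect_share:.1%}")
print(f"Associated emissions: {total_mwh * GRID_TCO2E_PER_MWH:,.0f} tCO2e/yr")
```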

However, in the near-term, mitigating the carbon footprint of Google’s data centers will likely entail water cooling, Townsend said. There may come a time when the grid is sufficiently decarbonized such that reducing energy consumption is no longer a good justification for using water, he said, but it’s unclear when that might be.

“What I suspect is that for quite a long time, the answer will be that when there is a sustainable water resource available, that if you can use that water to reduce energy consumption…that’s likely going to be a net beneficial environmental outcome,” he said.

The best data center sites will be locations with healthy watersheds and cool temperatures, he added, but of course, that’s not always possible — nor is it how the map of data center deployments is evolving.

Aside from these real estate considerations, other pathways to data center energy efficiency and water footprint reduction, many of which Google is already experimenting with, include directly connecting data centers to renewable energy sources and building out load-shifting capabilities.

A recent project with Fervo Energy, for instance, doesn’t directly power Google’s Nevada data centers with geothermal, but does send geothermal to the grid on which the data centers sit, filling in the gaps of other renewable energy sources there.
