Water, More than Power, May Prove the Critical Challenge for Future AI Data Centers
Most storylines about the future of data center and artificial intelligence (AI) demand growth have fixated on the challenge of developing enough power to energize next-generation computing, but the seemingly insatiable need for energy may be creating a deeper natural resource drama.
A new report by Bluefield Research warns that within four years, 72% of total water consumption by data centers will happen off-site as part of dedicated power generation. An executive summary of the “Water-Power Nexus” report notes that data center power demand may double in the coming decade, adding more than 100 GW of new generation, whether gas-fired or nuclear.
These types of baseload power plants use millions of gallons of water for steam generation and cooling. Add that to the already substantial industrial water needs of a high-level computing facility running AI workloads, and perhaps you’ve got a clean water supply crisis in the making.
“While data centers use water directly for on-site cooling, the rapid growth in electricity demand is driving up the sector’s overall water use outlook,” Amber Walsh, research director at Bluefield Research, said in the summary. “Power—not on-site cooling—is becoming the defining water risk for the data center industry.”
Current on-site data center water consumption totals less than 20 billion gallons per year, according to U.S. federal statistics used by Bluefield Research. This could grow by several billion gallons by 2030, rising some 11% per year, according to a Bluefield graphic.
Indirect data center water consumption, which includes power generation related to the computing centers, is around 54 billion gallons annually but may nearly double to 94 billion gallons per year by 2030, according to the report. This dramatic uptick in power generation may happen because the AI race and the Industrial Compute Age could delay coal-fired power plant retirements, lead to recommissioned nuclear facilities, and accelerate the rush to build new natural gas-fired generation.
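The growth figures above are mutually consistent, as a minimal Python sketch shows. Note the assumptions: the report does not state its exact base year, so a roughly five-year horizon to 2030 is assumed here, and the 11% direct-use growth rate is treated as a steady compound rate.

```python
# Illustrative growth arithmetic for the Bluefield figures.
# Assumption: a five-year horizon to 2030 (the report's base year is not stated).

def cagr(start, end, years):
    """Compound annual growth rate implied by two totals."""
    return (end / start) ** (1 / years) - 1

# Indirect (power-generation) water use: ~54 billion gallons/year today,
# projected ~94 billion gallons/year by 2030.
indirect_rate = cagr(54, 94, 5)
print(f"Implied indirect growth: {indirect_rate:.1%} per year")

# Direct (on-site cooling) use: just under 20 billion gallons/year,
# compounding at the ~11%/year rate cited in the graphic.
direct_2030 = 20 * (1 + 0.11) ** 5
print(f"Projected direct use by 2030: ~{direct_2030:.0f} billion gallons")
```

Under these assumptions, the indirect jump from 54 to 94 billion gallons works out to roughly 12% per year — close to the 11% annual growth cited for on-site use, so both streams are growing at a comparable clip.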
“We’re at an inflection point, and the decisions being made today about where to site gigafactories and how to power them will shape U.S. water availability for decades to come,” says Walsh. “Increasingly, water access has become a critical lever, alongside electricity prices, that communities and local governments are wielding to set the terms of engagement with data center developers.”
A report cited on the Environmental Law Institute’s website indicated that the training of the GPT-3 language model alone, in a Microsoft data center, could evaporate 700,000 liters of water.
About the Author
EnergyTech Staff
Rod Walton is head of content for EnergyTech.com. He has spent 17 years covering the energy industry as a newspaper and trade journalist.
Walton formerly was energy writer and business editor at the Tulsa World. Later, he spent six years covering the electricity power sector for Pennwell and Clarion Events. He joined Endeavor and EnergyTech in November 2021.
He can be reached at [email protected].
EnergyTech is focused on the mission critical and large-scale energy users and their sustainability and resiliency goals. These include the commercial and industrial sectors, as well as the military, universities, data centers and microgrids.
Many large-scale energy users, such as Fortune 500 companies, and mission-critical users, such as military bases, universities, healthcare facilities, public safety and data centers, are shifting their energy priorities to reach net-zero carbon goals within the coming decades. These plans include renewable energy power purchase agreements as well as on-site resiliency projects such as microgrids, combined heat and power, rooftop solar, energy storage, digitalization and building efficiency upgrades.
