AI’s Cooling Problem: How Data Centers Are Transforming Water Use
By: Mara Pusic
The rise of artificial intelligence (AI) and the rapid deployment of high-performance accelerated servers have dramatically transformed the energy use of data centers. U.S. data centers now account for about 4.4% of nationwide electricity consumption, up from about 1.9% in 2018, and it is predicted that by 2028 this figure could climb as high as 12.0%.
Rising energy consumption is not the only environmental concern associated with AI. Data centers also have a thirst for water, as large amounts of water are required both directly for cooling and indirectly for electricity generation and material manufacturing.
It is estimated that U.S. data centers directly consumed 21.2 billion liters of water in 2014 and 66 billion liters in 2023. As an example, recent research has shown that training the GPT-3 language model in Microsoft’s U.S. data centers can directly evaporate 700,000 liters of clean freshwater.
Inside a typical data center, servers use electrical energy to perform calculations and store data, and much of that energy is lost as heat. This heat must be removed to keep the equipment from overheating and breaking down. Historically, data centers have relied on air-cooling processes, which use fans or air conditioning to circulate chilled air. While this method is energy-intensive, it uses relatively little water. Recently, however, evaporative cooling, which uses water evaporation to cool the air, has become an increasingly popular alternative. Often used in large-scale operations, this method can be more energy-efficient and handle higher heat loads.
But there is an important tradeoff: although evaporative cooling uses less energy, significant water is lost as it evaporates along with the waste heat. Essentially, optimizing for energy efficiency can actually worsen water efficiency. Moreover, the water used in this process largely comes from “blue” sources, such as surface water or groundwater, often purchased from local water utilities. This tension between energy and water use has driven new research into innovative cooling mechanisms such as immersion liquid cooling, in which IT equipment is submerged in a non-conductive liquid that dissipates heat.
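To see why evaporative cooling is so thirsty, consider a rough back-of-the-envelope calculation. The sketch below uses illustrative assumptions that are not figures from this article: it treats every kilowatt-hour of server electricity as heat rejected entirely by evaporating water, and uses water’s latent heat of vaporization of roughly 2.45 MJ/kg at ambient temperatures.

```python
# Back-of-the-envelope estimate of direct water use for evaporative cooling.
# Illustrative assumptions (not figures from this article):
#   - every kilowatt-hour of server electricity becomes heat that is
#     rejected entirely by evaporating water
#   - latent heat of vaporization of water: ~2.45 MJ/kg at ambient temps

LATENT_HEAT_MJ_PER_KG = 2.45   # energy carried away per kg of water evaporated
MJ_PER_KWH = 3.6               # unit conversion: 1 kWh = 3.6 MJ

def evaporated_liters(it_load_kw: float, hours: float) -> float:
    """Liters of water evaporated to reject `it_load_kw` of heat for `hours`."""
    heat_mj = it_load_kw * hours * MJ_PER_KWH
    return heat_mj / LATENT_HEAT_MJ_PER_KG   # 1 kg of water ~ 1 liter

# A hypothetical 1 MW facility running for one day:
print(f"{evaporated_liters(1_000, 24):,.0f} liters/day")  # ~35,000 liters/day
```

Real facilities evaporate less per kilowatt-hour, since some heat is rejected by other means; the industry metric for this, Water Usage Effectiveness (WUE), expresses liters of water consumed per kilowatt-hour of IT energy, and the idealized sketch above works out to roughly 1.5 L/kWh.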
Nonetheless, the vast majority of a data center’s water footprint comes from indirect water use; in the U.S., data centers have an indirect water footprint of about 800 billion liters. One way data centers indirectly consume water is through electricity generation, which, particularly at thermal power plants, requires vast quantities of water for steam production and cooling. U.S. power grids have begun shifting toward cooling methods with lower water withdrawals, but ultimately, reducing the electricity sector’s water use will demand continued, coordinated action from developers, utilities, and regulators.
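A rough sketch shows how this indirect footprint scales with a facility’s electricity demand. Both numbers below, a 3 L/kWh grid water-intensity factor and a hypothetical facility drawing a constant 10 MW, are illustrative assumptions rather than figures from this article:

```python
# Sketch of the indirect water footprint from electricity generation.
# The water-intensity factor below is a hypothetical illustrative value;
# real factors vary widely with a grid's generation mix and plant cooling.

ASSUMED_GRID_WATER_L_PER_KWH = 3.0   # assumed liters consumed per kWh generated

def indirect_water_liters(annual_kwh: float,
                          intensity: float = ASSUMED_GRID_WATER_L_PER_KWH) -> float:
    """Water consumed upstream by power plants to supply `annual_kwh`."""
    return annual_kwh * intensity

# A hypothetical facility drawing a constant 10 MW:
annual_kwh = 10_000 * 24 * 365   # about 87.6 million kWh per year
print(f"{indirect_water_liters(annual_kwh) / 1e6:,.0f} million liters/year")
# -> 263 million liters/year under these assumptions
```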
Another way data centers indirectly consume water is through their supply chains, including the manufacturing of servers, chips, and other materials. Producing a single microchip, for example, requires 2.1-2.6 gallons of water just to cool machinery and keep silicon wafers free of contaminants.
Despite their resource demands, data centers still account for only a small share of total U.S. water use. Ultimately, water stress is predominantly a local issue. While some data centers sit in regions where water is abundant and can be accessed without competing with other users, others are built in drought-prone areas with deteriorating infrastructure. More than 160 new AI data centers have sprung up across the U.S. in the past three years in places with scarce water resources. The strain often peaks during hot summer months or periods of high electricity demand, when cooling systems ramp up and local utilities are already stretched thin. For many communities, the challenge isn’t the total amount of water used, but when and where it is used, and who bears the cost when supplies run low.
As the urgency of water management grows, major hyperscalers like Google and Microsoft have pledged to become “water positive” by 2030, committing to return more water to the environment than they consume. International standards are also evolving: the ISO/IEC’s first international standard on sustainable AI includes water footprint as a key metric. Still, transparency remains a major challenge. While most major tech companies now publish some form of water use data, reporting practices vary widely in detail and consistency. This lack of standardized reporting makes it difficult to compare companies’ water usage and efficiency or to assess progress toward sustainability goals.
Ultimately, the growth of the data center industry represents both a challenge and an opportunity. The Digital Economy & Environment Program (DEEP) at ELI seeks to promote data-transparency policies that would help decisionmakers proactively plan to site data centers where clean power and water are more readily available, and that would encourage facilities to operate more flexibly. This can reduce strain on the grid and on water utilities during periods of peak need, making data centers an asset rather than a drain.