
That’s according to a new study led by Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside. The findings, which have not yet been peer-reviewed but are publicly available on the preprint server arXiv, show that limited public water capacity is emerging as a significant barrier to data center growth.
Rather than tech companies footing the bill, the communities that host data centers are often left to raise the money for water infrastructure upgrades themselves, sometimes spending hundreds of millions of dollars, with the costs ultimately passed on to local ratepayers. “Those companies are driven by profit, right? So I think there’s clearly something wrong,” Ren told Gizmodo.
Why so thirsty?
Data centers operate continuously, generating a lot of heat from their dense concentrations of servers, networking equipment and other IT infrastructure. Liquid cooling technologies are the most effective way to prevent overheating and system failure, but they can consume enormous amounts of water.
Tech companies often argue that by using “closed-loop” cooling systems, their data centers recycle most of the water they use and thereby reduce consumption. But even these systems can consume large amounts of water, because many rely on evaporative cooling towers, which lose water to the atmosphere as they transfer heat out of the facility.
For example, according to the study, the peak daily water demand for a large state-of-the-art data center using evaporative cooling – the amount needed during the hottest days of the year – can often exceed 1 million gallons per day, and for some planned facilities it could reach 8 million gallons per day.
A water bottleneck
Public water systems are engineered to reliably meet maximum demand at all times, so a data center’s peak water usage is an important factor in infrastructure planning, system resiliency, and operational reliability. Despite this, most operators only disclose their total annual water use. To estimate the peak water demand of US data centers, Ren and his colleagues analyzed data from public sources, including government records and water utility databases.
The analysis showed that if current water use intensity continues, US data centers will need 697 million to 1.45 billion gallons per day of new peak water capacity by 2030, roughly equivalent to the typical daily water supply of New York City. The cost of building this additional capacity could range between $10 billion and $58 billion, with most of the financial burden falling on the communities hosting the data centers.
And that’s a “very conservative” estimate, Ren said. His team’s calculations assume a maximum-to-average daily water use ratio of only 4.5, which is at the low end of the spectrum.
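The peaking-factor arithmetic behind that estimate can be sketched in a few lines of Python. The 4.5 maximum-to-average ratio is the study's low-end assumption; the annual-use figure in the example is hypothetical, not a number from the paper.

```python
def peak_daily_demand_gal(annual_use_gal: float, peaking_factor: float = 4.5) -> float:
    """Estimate a facility's peak daily water demand from its annual use.

    peaking_factor is the ratio of maximum to average daily use;
    4.5 is the low-end value the study assumes.
    """
    average_daily = annual_use_gal / 365  # average gallons per day
    return average_daily * peaking_factor

# Hypothetical example: a facility reporting 100 million gallons per year
# would imply a peak demand of roughly 1.23 million gallons per day.
print(round(peak_daily_demand_gal(100e6)))
```

Because utilities must size infrastructure for the peak rather than the average, a higher peaking factor would imply proportionally more new capacity, which is why Ren calls the estimate conservative.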
This presents a number of problems for the tech industry. Insufficient water capacity can directly affect the feasibility of data center projects, leading to increased costs, delays, and scaled-back plans. Water scarcity can also force operational compromises: when water is unavailable, data centers often have to fall back on dry cooling, which uses air instead of water. Dry cooling is much less efficient and increases electricity demand, putting further strain on the grid at the height of summer.
Ren and his colleagues have some ideas for addressing the growing water capacity demands of US data centers. First, they emphasize the importance of requiring data centers to report not just their total annual water use, but their peak demand. They also recommend corporate-community partnerships to finance infrastructure upgrades so that residents do not have to bear the entire burden.
“I don’t see any way for them to afford this type of upgrade,” Ren said. “We need corporate funding and support.”
As data centers continue to proliferate across the country, the technology sector will be forced to grapple with this often overlooked hurdle. If nothing changes, these companies will face consequences along with the communities they impact.