As daily use of AI has exploded in recent years, so has the energy demand of the IT infrastructure that supports it. But the environmental impact of these large data centers, which consume gigawatts of power and require enormous quantities of water for cooling, has remained diffuse and difficult to quantify.
Now, Cornell researchers have used advanced data analytics — and, of course, a little AI, too — to create a state-by-state overview of this environmental impact. The team found that, at its current rate of growth, AI would release 24 to 44 million tons of carbon dioxide into the atmosphere each year by 2030, equivalent to the emissions of 5 to 10 million cars on U.S. roads. It would also consume 731 million to 1.125 billion cubic meters of water per year, equivalent to the annual household water use of 6 to 10 million Americans. The cumulative effect would put the AI industry’s net-zero emissions targets out of reach.
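These equivalences can be sanity-checked with back-of-the-envelope arithmetic. The sketch below is not from the study; it assumes the EPA's commonly cited figures of roughly 4.6 metric tons of CO2 per passenger car per year and about 82 gallons of water per person per day, and reads the water range as 731 million to 1.125 billion cubic meters.

```python
# Back-of-the-envelope check of the article's equivalence figures.
# The constants are common EPA estimates, not values stated in the study.
GAL_TO_M3 = 3.785e-3  # gallons to cubic meters

CAR_TONS_CO2_PER_YEAR = 4.6  # EPA: typical passenger vehicle, metric tons CO2/yr
PERSON_M3_WATER_PER_YEAR = 82 * GAL_TO_M3 * 365  # EPA: ~82 gal/person/day

def car_equivalents(tons_co2: float) -> float:
    """Number of average U.S. passenger cars with the same annual CO2 output."""
    return tons_co2 / CAR_TONS_CO2_PER_YEAR

def people_equivalents(m3_water: float) -> float:
    """Number of Americans whose annual household water use matches this volume."""
    return m3_water / PERSON_M3_WATER_PER_YEAR

print(f"{car_equivalents(24e6)/1e6:.1f} to {car_equivalents(44e6)/1e6:.1f} million cars")
print(f"{people_equivalents(731e6)/1e6:.1f} to {people_equivalents(1.125e9)/1e6:.1f} million people")
```

Both results land in the ranges the article reports: roughly 5 to 10 million cars and 6 to 10 million people.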
On the positive side, the study also presents a concrete roadmap that would use smart siting, faster grid decarbonization and operational efficiencies to reduce these impacts by approximately 73% (carbon dioxide) and 86% (water) compared to worst-case scenarios.
The results were published Nov. 10 in Nature Sustainability. The first author is doctoral student Tianqi Xiao of the Process-Energy-Environmental Systems Engineering (PEESE) laboratory.
“Artificial intelligence is changing every sector of society, but its rapid growth comes with a real energy, water and carbon footprint,” said Fengqi You, the Roxanne E. and Michael J. Zak Professor in Energy Systems Engineering at Cornell Engineering, who led the project. “Our study is designed to answer a simple question: Given the scale of the AI computing boom, what environmental trajectory will it take? And more importantly, what choices will steer it toward sustainability?”
To quantify the environmental footprint of the country’s AI IT infrastructure, the team three years ago began compiling “multiple dimensions” of financial, marketing and manufacturing data to understand how the industry is growing, combined with location-specific data on power systems and resource consumption, and how they relate to climate change.
“There’s a lot of data, and it’s a huge effort. Sustainability information, like energy, water and climate, tends to be open and public. But industrial data is difficult, because not all companies report everything,” You said. “And of course, ultimately we still have to consider several scenarios. There is no one-size-fits-all solution; each region is different in terms of regulations. We also used AI to fill part of the data gap.”
But projecting the impacts was not enough. The researchers also wanted to provide data-driven guidance for sustainable growth of AI infrastructure.
“There is no silver bullet,” You said. “Site selection, grid decarbonization and operational efficiency work together: that’s how you get reductions of around 73% for carbon and 86% for water.”
By far the most important factor: location, location, location.
Many of today’s data center clusters are being built in water-scarce regions, such as Nevada and Arizona. And in some areas, such as northern Virginia, rapid consolidation can strain local infrastructure and water resources. Locating facilities in areas with less water stress and improving cooling efficiency could reduce water demand by approximately 52%, and, combined with operational and grid best practices, total water reductions could reach 86%, according to the study. The Midwest and “wind belt” states – particularly Texas, Montana, Nebraska and South Dakota – would offer the best combined carbon-water profile.
“New York remains a low-carbon, climate-friendly option with its clean power mix of nuclear, hydropower and growing renewables,” You said, “even as water-efficient cooling and additional clean energy are prioritized.”
If grid decarbonization fails to keep pace with computing demand, emissions could increase by around 20%.
“Even if every kilowatt-hour becomes cleaner, total emissions may increase if demand for AI grows faster than the grid decarbonizes,” You said. “The solution is to accelerate the transition to clean energy where AI computing is growing.”
However, there is only so much that grid decarbonization can do. Even under an ambitious renewable-energy scenario, carbon dioxide emissions in 2030 would decline by only about 15% compared to the baseline, and about 11 million tons of residual emissions would remain, requiring roughly 28 gigawatts of wind or 43 gigawatts of solar capacity to reach net zero.
The researchers determined that deploying a range of energy- and water-saving technologies, such as advanced liquid cooling and better server utilization, could eliminate an additional 7% of carbon dioxide emissions and cut water consumption by 29%, for a total water reduction of 32% when combined.
As companies such as OpenAI and Google pour ever more money into rapidly building AI data centers to meet demand, this is a crucial time for coordinated planning among industry, utilities and regulators to avoid local water shortages and higher grid emissions, according to You.
“It’s construction time,” he said. “The AI infrastructure choices we make this decade will determine whether AI accelerates climate progress or becomes a new environmental burden.”
Co-authors include researchers from KTH Royal Institute of Technology in Stockholm, Sweden; Concordia University in Montreal, Canada; and the European Institute of Economics and Environment RFF-CMCC in Milan, Italy.
The research was supported by the National Science Foundation and the Eric and Wendy Schmidt AI in Science Program.
