By Michela Taufer, Ph.D., and Chandra Krintz, Ph.D.
From advanced weather forecasting systems to precision medicine tailored to individual patients, AI is rapidly transforming virtually every sector of industry and culture. However, as we unlock the full potential of AI, society faces a worrying paradox: the very technology we use to solve global problems could exacerbate one of our most pressing challenges, climate change.
Recent projections show that by 2030, AI could consume up to 21% of the world’s electricity supply. This staggering figure is underscored by the recent announcement that Microsoft plans to reactivate a decommissioned nuclear power plant to power a data center. Such developments highlight the urgent need to consider the environmental impact of AI, even as humanity harnesses it to push the boundaries of science to solve problems and improve lives. We need to ask ourselves: are we “solving” our way to disaster?
As part of our work with the Computing Research Association’s Computing Community Consortium (CCC) Sustainability and Climate Resilience Working Group, we are fortunate to have a platform at the SC24 conference. We’ll take this opportunity to bring together experts from science and industry to consider some of the most important questions technology leaders should ask themselves as they seek to balance the benefits of AI with the threats of climate change. We will challenge the AI and high-performance computing (HPC) communities to think about the growth trajectory of AI and its environmental implications, and we will raise questions that need to be at the forefront of discussions about the future of AI. Here, in advance, we present three such questions for leaders around the world to consider, in hopes of sparking an industry-wide debate.
How can AI continue to drive innovation while minimizing environmental damage?
By now, we can all agree that AI has enormous potential, but we must ask whether the growing environmental cost of developing it is undermining its benefits. Large language models like GPT or BERT require vast computational resources and, therefore, enormous amounts of energy. A single ChatGPT query consumes approximately ten times more energy than a traditional Google search. These operations take place in large data centers whose energy consumption keeps climbing. As AI continues to evolve, its energy requirements and carbon footprint threaten to outweigh the benefits it promises to deliver.
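To make that ratio concrete, here is a minimal back-of-envelope sketch in Python. The per-query energy figures and the daily query volume are illustrative assumptions chosen to reflect the roughly ten-to-one ratio above; they are not measurements of any specific service.

```python
# Back-of-envelope comparison of inference energy at scale.
# All figures below are illustrative assumptions, not measurements.

SEARCH_WH_PER_QUERY = 0.3      # assumed energy for a traditional web search (Wh)
LLM_WH_PER_QUERY = 3.0         # assumed energy for an LLM chat query (~10x the above)
QUERIES_PER_DAY = 100_000_000  # hypothetical daily query volume

def daily_energy_kwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in kWh for a given per-query cost."""
    return wh_per_query * queries / 1000.0

search_kwh = daily_energy_kwh(SEARCH_WH_PER_QUERY, QUERIES_PER_DAY)
llm_kwh = daily_energy_kwh(LLM_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"Search-style workload: {search_kwh:,.0f} kWh/day")
print(f"LLM-style workload:    {llm_kwh:,.0f} kWh/day")
print(f"Extra energy per day:  {llm_kwh - search_kwh:,.0f} kWh")
```

Even with generous assumptions, the gap compounds quickly at web scale, which is why per-query efficiency matters as much as model capability.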
It is tempting for individuals to decide that problems like these are too big for them to solve, but local efforts are a crucial starting point. This looming peril calls for a concerted push by the HPC community to prioritize energy efficiency in the design of AI systems, from hardware to code, combining energy-efficient architectures with optimized algorithms that reduce AI’s carbon emissions. It also means developing a workforce that understands the system-level implications of AI training, including energy consumption, data movement, and their associated costs. Instead of focusing only on improving AI capabilities, we must also prioritize the efficiency of these systems in order to achieve a sustainable balance.
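This is the kind of system-level estimate we would like every practitioner to be comfortable making. The sketch below computes the energy and carbon footprint of a hypothetical training run; the GPU count, power draw, runtime, data-center PUE, and grid carbon intensity are all assumed values for illustration, not figures for any real job.

```python
# Rough estimate of training-run energy and carbon footprint.
# Every input below is a hypothetical assumption for illustration only.

def training_footprint(num_gpus: int,
                       avg_gpu_power_w: float,
                       hours: float,
                       pue: float = 1.2,                 # assumed data-center overhead factor
                       grid_kgco2_per_kwh: float = 0.4): # assumed grid carbon intensity
    """Return (energy in kWh, emissions in kg CO2e) for a training run."""
    it_energy_kwh = num_gpus * avg_gpu_power_w * hours / 1000.0
    facility_energy_kwh = it_energy_kwh * pue            # include cooling, power delivery, etc.
    emissions_kg = facility_energy_kwh * grid_kgco2_per_kwh
    return facility_energy_kwh, emissions_kg

# Example: a hypothetical 512-GPU job running for two weeks at 400 W per GPU.
energy, co2 = training_footprint(num_gpus=512, avg_gpu_power_w=400, hours=14 * 24)
print(f"Estimated energy:    {energy:,.0f} kWh")
print(f"Estimated emissions: {co2:,.0f} kg CO2e")
```

The point is not the exact numbers but the habit: before launching a run, estimate what it will cost the grid and the atmosphere, not just the budget.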
Communities engaged in AI development must recognize that while innovation is the goal, balance is essential. It is their responsibility to ensure that future AI technologies not only solve complex problems but do so with minimal impact on the environment.
What research gaps need to be filled to ensure AI development aligns with the Sustainable Development Goals?
While we have some understanding of specific cases of AI energy consumption (e.g., ChatGPT queries), our broader understanding of AI’s overall carbon footprint remains limited. We lack comprehensive knowledge of the trade-offs between performance and efficiency across different AI systems. Without a concerted effort to close these gaps, the growing environmental burden of AI could outweigh its promised benefits.
Many areas require further research, including energy-efficient algorithms and software layers, as well as the exploration of alternative architectures, such as neuromorphic and quantum computing, as paths to greater energy efficiency. New sustainability practices also need to be explored, including energy-efficiency metrics and benchmarks that measure the impact of these innovations; a minimal sketch of one such metric follows below.
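As a rough illustration of what such a metric could look like, the sketch below reports throughput per joule for an arbitrary workload. The average power value is assumed to come from an external source such as a power meter or vendor telemetry (RAPL counters, nvidia-smi readings) and is passed in by hand; the workload shown is a dummy stand-in.

```python
import time

# Sketch of a simple energy-efficiency benchmark metric: useful work per joule.
# avg_power_watts is an assumed measurement supplied by the benchmarker.

def samples_per_joule(run_workload, num_samples: int, avg_power_watts: float) -> float:
    """Run a workload and report throughput normalized by estimated energy."""
    start = time.perf_counter()
    run_workload()                                    # the benchmarked AI workload
    elapsed_s = time.perf_counter() - start
    energy_joules = avg_power_watts * elapsed_s       # energy = power x time
    return num_samples / energy_joules

# Hypothetical usage with a dummy workload standing in for a training or inference step.
def dummy_workload():
    sum(i * i for i in range(1_000_000))

metric = samples_per_joule(dummy_workload, num_samples=1_000_000, avg_power_watts=300.0)
print(f"Efficiency: {metric:.2f} samples per joule")
```

Reporting efficiency alongside raw performance, rather than performance alone, is the behavioral change such benchmarks are meant to encourage.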
We also need to develop educational programs that equip early-career professionals with a fundamental understanding of the system-level impacts and implications of AI. This is particularly necessary for the design of energy-efficient systems, where optimizing data movement and minimizing energy consumption are key to sustainable AI development. By creating a culture of sustainable computing, we can prepare future generations of AI researchers to tackle these challenges from the start.
A holistic approach is essential. We must develop a workforce capable of driving responsible innovation, balancing technological progress and environmental stewardship. Now is the time to lay the foundations for a future where the full potential of AI is realized without sacrificing the health of our planet.
How can collaboration between technologists and environmental scientists lead to advances in sustainable AI practices?
Technologists alone cannot develop sustainable AI solutions. The challenges are too complex to be solved by a single discipline. We need strong collaborations that bring together the expertise of technologists, environmental scientists, ethicists, and other fields to solve these problems.
Our upcoming panel at SC24 is intended to be a starting point for this collaborative approach. We have assembled a “dream team” of domain experts who bring unique perspectives to the discussion and can help navigate these complex questions. Through this approach, we hope to identify new pathways to reduce the environmental impact of AI while pushing the boundaries of innovation.
Our message is clear: collaboration is essential to develop next-generation solutions that strike a balance, ensuring we do not pursue the benefits of innovation at the cost of irreversible climate damage. By leveraging the knowledge of leaders in fields including accelerated computing architectures, advanced cooling technologies, renewable energy integration, improved data center design, and even policy and governance, we can develop more efficient and environmentally friendly AI systems.
This dialogue among diverse perspectives can be a valuable catalyst for change. We encourage the HPC and AI communities to actively engage with experts from adjacent domains and disciplines to identify areas where sustainable AI practices can be co-developed. Through these partnerships, new doors will open to address these complex challenges, yielding innovations that benefit society while protecting the environment.
The way forward
The future of AI requires sustainability to be a priority rather than an afterthought. These are not just technical challenges, but also moral imperatives that require immediate attention. We encourage our colleagues to engage in these critical conversations, participate in relevant forums, and join us in Atlanta this November to ensure these discussions take root in the broader community.
The impacts of AI will extend across many disciplines, from public health to agriculture, making the quest for sustainable AI not only a technical challenge but a societal necessity. By working together, we can ensure that the transformative potential of AI is realized in a way that respects and conserves our planet’s resources. Let’s rise to this challenge and shape a future where technological progress and environmental stewardship go hand in hand.
About the authors
Dr. Michela Taufer is the Dongarra Professor at the University of Tennessee, Knoxville, and Vice Chair of ACM SIGHPC, leading research on scientific applications on heterogeneous platforms and AI for cyberinfrastructure.
Dr. Chandra Krintz is a professor at the University of California, Santa Barbara, and co-founder of AppScale Systems, Inc. Her research focuses on the intersection of IoT, cloud computing, and data analytics, with applications in agriculture, livestock, and ecology.