The Environmental Cost of AI | Climate Crisis

Generative AI services use a lot of electricity and water, and create a lot of e-waste. The ecological impact of the technology is just beginning to be studied and discussed.


Last year Google announced that it was significantly behind on a pledge to become carbon neutral by 2030. “Its emissions are actually up nearly 50 percent since 2019—one factor: artificial intelligence [AI] and the energy required to power it via the company’s massive data centers,” PBS NewsHour anchor Amna Nawaz explained during a recent network segment on AI and energy.

Generative AI tools such as Google’s Gemini, OpenAI’s ChatGPT, and Microsoft’s Copilot have been rapidly changing the way people search and discover content, write, create, code, and more. Search Google lately, and the result at the top of the page is likely an AI overview from Gemini. Microsoft began adding Copilot to Windows 11 PCs via updates last year. And according to a recent report by CNBC, ChatGPT’s weekly active user base surpassed 400 million in February, up 33 percent from December 2024. “People hear about it through word of mouth. They see the utility of it. They see their friends using it,” OpenAI COO Brad Lightcap told CNBC, explaining ChatGPT’s rapid growth. “There’s an overall effect of people really wanting these tools, and seeing that these tools are really valuable.”

This rapid growth requires a lot of energy usage, and the ecological impact of generative AI is just beginning to be studied and discussed. While many technology companies publish environmental reports, most haven’t begun breaking out specific information about their AI initiatives. “It remains very hard to get accurate and complete data on environmental impacts” of AI, Kate Crawford, a professor at the University of Southern California Annenberg and senior principal researcher at Microsoft Research, writes in a February 2024 article for Nature. “The full planetary costs of generative AI are closely guarded corporate secrets. Figures rely on lab-based studies by researchers such as Emma Strubell [assistant professor, Language Technologies Institute, School of Computer Science at Carnegie Mellon University] and Sasha Luccioni [climate lead at Hugging Face, a global startup in responsible open-source AI]; limited company reports; and data released by local governments. At present, there’s little incentive for companies to change.”

WATER, WATER, EVERYWHERE

For example, Crawford notes that a lawsuit by residents of West Des Moines, IA, revealed that a local Microsoft data center that was training GPT-4 for OpenAI used 6 percent of the district’s water in July 2022—the month before training was complete. A 2023 Associated Press (AP) report explains that for most of the year, Microsoft’s data centers in Iowa can be air cooled, but during the summer—anytime it’s at least 85°F outside—the centers require an enormous amount of fresh water for cooling. Microsoft’s 2022 environmental sustainability report disclosed that the company’s global water consumption increased 34 percent from 2021 to 2022, to nearly 1.7 billion gallons—a spike that correlates with the company’s investments in training generative AI. Google also reported a 20 percent increase in water usage during that time.

Shaolei Ren, associate professor in the University of California, Riverside’s Department of Electrical and Computer Engineering, told AP he estimates that ChatGPT uses almost 16 ounces of water when responding to a series of five to 50 queries. “The range varies depending on where its servers are located and the season. The estimate includes indirect water usage that the companies don’t measure—such as to cool power plants that supply the data centers with electricity,” according to the AP.

Water is needed for cooling because training AI uses a lot of electricity, causing data center equipment to generate heat. According to “AI and energy: Will AI help reduce emissions or increase demand? Here’s what to know,” a report published by the World Economic Forum in July 2024, some estimates indicate that a ChatGPT query may require as much as 10 times the electricity of a regular internet search. Also, training GPT-3 is estimated to have consumed 1,300 megawatt hours of electricity, equivalent to the annual power consumption of 130 U.S. homes, while training the more recent, more advanced GPT-4 used 50 times more electricity. And ChatGPT is just one of the many generative AI services that have launched in recent years. “Overall, the computational power needed for sustaining AI’s growth is doubling roughly every 100 days,” the report states.
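The household comparison in those estimates can be sanity-checked with simple arithmetic, assuming the commonly used round figure of roughly 10 megawatt-hours of electricity per average U.S. home per year (the exact average varies by source and year):

```python
# Back-of-envelope check of the training-energy comparisons above.
# ASSUMPTION: ~10 MWh/year for an average U.S. home (a round figure,
# not taken from the report itself).

GPT3_TRAINING_MWH = 1_300          # estimate cited in the report
AVG_US_HOME_MWH_PER_YEAR = 10      # assumed round figure

homes_equivalent = GPT3_TRAINING_MWH / AVG_US_HOME_MWH_PER_YEAR
print(f"GPT-3 training is roughly {homes_equivalent:.0f} homes' annual use")

# The report says training GPT-4 used about 50 times more electricity.
gpt4_mwh = GPT3_TRAINING_MWH * 50
print(f"GPT-4 training estimate: {gpt4_mwh:,} MWh")
```

Under that assumption, the 1,300 megawatt-hours cited for GPT-3 works out to about 130 homes, matching the report’s comparison, and the 50-fold figure for GPT-4 implies on the order of 65,000 megawatt-hours.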

WHAT A WASTE

Maintaining this growth also requires data centers to periodically upgrade or replace aging equipment ranging from circuit boards and batteries used for backup power to memory modules and the graphics processing units (GPUs) whose parallel processing capabilities are crucial for AI algorithms. The depreciation/refresh cycle for data centers is usually three to five years. Citing “E-waste challenges of generative artificial intelligence,” an academic study published in Nature Computational Science last October, a recent IEEE Spectrum article by Katherine Bourzac notes that continued rapid growth of generative AI “could result in an annual e-waste stream of 2.5 million tonnes [2.5 billion kilograms] by 2030.” The study isn’t even comprehensive—it focused exclusively on computationally intensive large language models (LLMs) like ChatGPT. “Including other forms of AI would increase the projected e-waste figures,” one of the study’s authors, Asaf Tzachor, a sustainability and climate researcher at Reichman University in Israel, tells Bourzac.

E-waste poses significant problems for both the environment and human health. Discarded electronics that end up in landfills often contain toxic metals such as lead, mercury, and cadmium that can leach into the soil and contaminate groundwater. Rare earth metals are also found in many types of electronic devices. Some methods for extracting those metals from recycled e-waste create problems of their own, melting down electronics or dissolving them with chemicals, but the alternative to recycling is throwing these rare earth metals away and mining for more. The United Nations Institute for Training and Research (UNITAR) reported in its Global E-Waste Monitor 2024 that only one percent of rare earth element demand is currently being met by e-waste recycling.

To be fair, even the 2.5 million metric tons of e-waste that generative AI data centers could potentially be creating by 2030 would be a relatively small contribution to an already massive environmental problem. The UNITAR report notes that 62 million metric tons of e-waste were produced in 2022, enough to “fill 1.55 million 40-tonne trucks, roughly enough trucks to form a bumper-to-bumper line encircling the equator.” Trends point to e-waste growing 32 percent to 82 million metric tons by 2030, and e-waste generation is currently rising five times faster than documented e-waste recycling, according to the report. Currently, small devices such as toys, microwave ovens, vacuum cleaners, and e-cigarettes account for 33 percent of e-waste (20.4 million metric tons), while small IT and telecommunication equipment such as laptops, mobile phones, GPS devices, and routers account for 7.4 percent of e-waste (4.6 million metric tons), with only a 22 percent documented collection and recycling rate.
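The UNITAR figures quoted above are internally consistent, as a quick cross-check shows (using only the numbers stated in the report):

```python
# Cross-checking the UNITAR Global E-Waste Monitor figures cited above.

total_2022_mt = 62          # million metric tons of e-waste in 2022
projected_2030_mt = 82      # projected million metric tons by 2030

# Growth from 2022 to 2030, as a percentage of the 2022 total.
growth_pct = (projected_2030_mt - total_2022_mt) / total_2022_mt * 100
print(f"Projected growth: {growth_pct:.0f}%")

small_devices_mt = 20.4     # toys, microwaves, vacuums, e-cigarettes
small_it_mt = 4.6           # laptops, phones, GPS devices, routers
print(f"Small devices share: {small_devices_mt / total_2022_mt:.1%}")
print(f"Small IT share: {small_it_mt / total_2022_mt:.1%}")
```

The 62-to-82-million-ton projection is indeed about a 32 percent increase, and the small-device and small-IT tonnages match their stated shares of roughly 33 and 7.4 percent.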

DOING THE RIGHT THING?

Companies including Google, Microsoft, and OpenAI at least publicly say that they are committed to reducing the environmental impact of technology, including generative AI. In response to AP’s questions about its water usage for AI training, Microsoft issued a statement saying, “We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power data centers, purchasing renewable energy, and other efforts to meet our sustainability goals of being carbon negative, water positive, and zero waste by 2030.” And Microsoft’s new “Circular Centers,” which process, recycle, and repurpose old hardware from the company’s cloud computing data centers (partly with the help of AI), signal a corporate-level commitment to reducing e-waste. In his interview with IEEE Spectrum, Tzachor suggests that government regulations may be needed to ensure that technology companies adopt e-waste reduction strategies.

There is also hope, at least of a corporate sort, that AI can be leveraged to mitigate greenhouse gas (GHG) emissions in the future. A 2023 report by Boston Consulting Group, coauthored with Google, suggests that AI could help reduce GHG emissions by five to 10 percent by 2030 and “significantly bolster climate-related adaptation and resilience initiatives.”

Hardware manufacturers also continue to develop more powerful and more energy-efficient components. For example, GPU manufacturer Nvidia’s Blackwell AI “superchips,” launched late last year, reportedly deliver a 30-fold performance improvement while using 25 percent less energy when training AI models, according to the company. If such advances continue, AI data centers could need less electricity and water per workload, even as they continue to proliferate.

And then there’s the case of DeepSeek. Developed in China and launched in January, the AI-powered chatbot’s creators claim it is as powerful as ChatGPT, despite being developed at a fraction of the cost and with a fraction of the energy use. Its launch and consumer-facing performance led to a sell-off of Nvidia shares that wiped out almost $300 billion of the high-flying company’s value in less than a month. OpenAI quickly accused DeepSeek’s creator of intellectual property theft, claiming that it had bombarded ChatGPT with queries and used the responses to “distill” its models. It’s an interesting case, since OpenAI is essentially arguing that ChatGPT—an AI model that has been trained entirely on the copyrighted and public domain works of others—was generating original content in its responses. But even if DeepSeek used ChatGPT to develop its model, it may point toward a new, more environmentally friendly way forward for generative AI, in which multiple corporations aren’t all working to reinvent the same wheel.

Matt Enis

menis@mediasourceinc.com

@MatthewEnis

Matt Enis (matthewenis.com) is Senior Editor, Technology for Library Journal.

