I Keep Hearing About the Environmental Costs of Using AI, But What Are They?


By Jaci Cooper, Director of Projects and Planning

Using tools like ChatGPT, Claude or Copilot feels much like a Google search: type a question, get a response. But the output is faster, more detailed and free of ads and clutter. Still, behind that smoother experience, large language models (LLMs) carry an environmental cost, in both carbon and water, that is greater than most people realize.

Recently, I started seeing posts online comparing AI use to bottles of water and warning of its waste. That sounds surprising — after all, it’s just a computer program, right? But these comparisons are based on how much water is used to cool the powerful machines that run AI. Here’s how it works: large language models like ChatGPT or Grok run on huge computer systems housed in buildings called data centers. These machines get hot when they run, especially during training or when many people are using them. To keep them from overheating, cold water is used in a process called evaporative cooling. As the water absorbs heat, it evaporates into the air and is gone; it can’t be captured and reused. Newer systems can recycle some water, but most data centers today still rely on older methods.

Estimates vary widely: OpenAI’s CEO puts each ChatGPT query at about 0.3 mL of water; researchers at UC Riverside estimate a 100-word response can use around 500 mL; and MIT News reports that AI cooling demands can strain local water systems, potentially drawing millions of liters per day for large facilities. The bottom line is that water use is a real concern AI companies will need to address.

Another clear example of this impact is happening in Memphis, Tennessee. A massive data center has been built there to support AI technology. It has permits to run 15 methane gas turbines, with another nine reported on site, which experts say could make it the largest source of air pollution in the city. That’s a big deal for a community already facing high rates of asthma and other health issues. Local leaders and health officials are pushing back, worried about the effects on the people who live nearby.

While the exact numbers about water and energy use are still being studied, one thing is clear: using AI tools has real-world environmental costs. These systems don’t just “live in the cloud.” They run on machines that use electricity, burn fuel and need water to stay cool.  

These technologies are becoming more embedded in much of what we do, so how can you be a more mindful user?   

Use AI only when it adds value  

Prioritize tasks that truly benefit from it—like summarizing long documents, generating creative ideas or translating languages. Avoid using AI for things you could easily look up or do quickly yourself. This reduces the number of server requests and, over time, lessens energy and water demand. “Treat AI like you would your car—great for longer trips, but not worth starting up just to walk to the mailbox.” — Jevin West, University of Washington, Center for an Informed Public

Batch your questions  

Instead of sending one question at a time, gather your thoughts and write more complete prompts. Fewer, more thoughtful interactions reduce the energy load on these systems.

Use low-energy tools when possible  

Basic search engines or local tools (like offline apps or PDF readers) use far fewer resources than cloud-based LLMs. Reserve AI tools for higher-level reasoning, not fact-checking or spelling.  

Support platforms that report environmental data  

Experts from MIT and The Alan Turing Institute urge users to choose companies that publish their environmental practices, use renewable energy and invest in efficient cooling. Transparency helps drive accountability.  

Push for sustainable tech policies  

As a user, you can support companies and organizations that are advocating for more sustainable AI infrastructure, like using reclaimed water, efficient chips or clean energy. 

Want to learn more about this topic? Check out this book from our collection: 

Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence by Kate Crawford, available in the Kent District Library catalog on BiblioCommons

Here are some links used to inform this post and others for further reading: 

Zewe, A. (2025, January 17). Explained: Generative AI’s environmental impact. MIT News.

Jegham, N., et al. (2025, May 14). How hungry is AI? Benchmarking energy, water, and carbon footprint of LLM inference (arXiv preprint).

Li, P., et al. (2023). Making AI less “thirsty”: Uncovering and addressing the secret water footprint of AI models (arXiv preprint arXiv:2304.03271).

This post is part of a blog series on AI, where we break down big ideas into simple, practical insights.