Generative AI: The Energy Behind Each Answer

You’ve seen the videos—robots taking over cities, sparing only those who were kind to them back when they were just chatbots. So, to be safe, you type “thank you” after every prompt. It seems harmless: a safety measure in case the joke does come true. But behind those small words lies a cost far more immediate than a robot apocalypse.

Every prompt you enter into a generative AI system sets off a cascade of computation far beyond your screen. Your words travel through wires and networks until they reach a data center, a massive building packed with powerful graphics processing units (GPUs). Instead of looking up an answer, these machines run complex calculations in seconds to formulate an entirely new response for you. While a conventional search engine retrieves existing information, generative AI composes its replies from scratch. All this intense computation doesn't come free: powering AI models requires enormous amounts of electricity, and heavy AI workloads can demand hundreds or even thousands of times the processing power and electricity of a basic web search.
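For a sense of scale, here is a rough back-of-envelope sketch. The per-query figures (an often-cited ~0.3 watt-hours for a web search, a few watt-hours for a generative reply) are illustrative assumptions, not measurements from this article's sources; estimates for heavier workloads run far higher.

```python
# Back-of-envelope comparison of per-query energy use.
# Both figures below are illustrative assumptions; published
# estimates vary widely.
SEARCH_WH = 0.3   # assumed energy for one conventional web search (Wh)
GENAI_WH = 3.0    # assumed energy for one generative AI response (Wh)

ratio = GENAI_WH / SEARCH_WH
print(f"One AI response uses roughly {ratio:.0f}x a web search")

# At a hypothetical one billion AI queries per day, the gap is grid-scale:
daily_queries = 1e9
extra_mwh_per_day = daily_queries * (GENAI_WH - SEARCH_WH) / 1e6  # Wh -> MWh
print(f"Extra demand: about {extra_mwh_per_day:,.0f} MWh per day")
```

Even at this modest per-query multiple, the difference compounds into thousands of megawatt-hours a day once queries reach internet scale.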

As demand for computing power grows, data centers are consuming ever more energy. The rapid expansion of AI is significantly increasing global electricity demand, adding pressure to already strained power grids. Data centers account for roughly 1 to 1.5% of global electricity consumption today, about 415 terawatt-hours (TWh) per year, more than many entire countries use annually. As we incorporate AI tools into our daily lives, that figure is expected to more than double by 2030, reaching a staggering 945 TWh.
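Those figures can be sanity-checked with simple arithmetic. The 415 and 945 TWh numbers come from the article; the global generation total (~30,000 TWh per year) is an assumed round figure for illustration:

```python
# Sanity check on the article's data-center electricity figures.
TODAY_TWH = 415.0            # current annual data-center use (from the article)
PROJECTED_2030_TWH = 945.0   # projected 2030 use (from the article)
GLOBAL_TWH = 30_000.0        # assumed global annual generation, round number

share = TODAY_TWH / GLOBAL_TWH * 100
growth = PROJECTED_2030_TWH / TODAY_TWH
print(f"Data centers today: about {share:.1f}% of global electricity")
print(f"Projected growth by 2030: about {growth:.1f}x")
```

The result lands inside the article's 1 to 1.5% range, and the 2030 projection does indeed work out to more than double today's consumption.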

AI consumes energy on an unprecedented scale: the emissions from training a single AI model are estimated at 200 to 500 metric tons of CO2, roughly the annual carbon footprint of dozens of households. And generative AI's appetite doesn't stop at electricity. These facilities operate around the clock, and their servers throw off intense heat while running. To prevent overheating, industrial cooling systems circulate huge volumes of water, in some cases millions of liters per year just to keep the machines cool. Beyond cooling, producing the GPUs themselves requires metals such as lithium and cobalt, and their extraction brings habitat destruction, toxic waste, and high carbon emissions.
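A quick check of that household comparison; the per-household figure (~7.5 metric tons of CO2 per year) is an assumed illustrative value, not from this article's sources:

```python
# Does 200-500 t of CO2 really equal "dozens of households"?
TRAIN_LOW_T = 200.0    # low training-emissions estimate (from the article)
TRAIN_HIGH_T = 500.0   # high training-emissions estimate (from the article)
HOUSEHOLD_T = 7.5      # assumed annual household CO2 footprint (t), illustrative

low = TRAIN_LOW_T / HOUSEHOLD_T
high = TRAIN_HIGH_T / HOUSEHOLD_T
print(f"Training one model: roughly {low:.0f} to {high:.0f} households' "
      f"annual emissions")
```

Under that assumption the range works out to roughly 25 to 65 households, consistent with the article's "dozens."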

So what happens when you send that "thank you" follow-up? It may seem insignificant, but it triggers the same energy-hungry cycle: the GPUs whir, the servers compute, and more electricity and water are consumed. In a world already facing an energy crisis, worsened by global conflicts and unstable oil supplies, this rising demand pushes finite resources closer to their breaking point. What feels like a small, harmless gesture is, in reality, part of a growing environmental toll, one that extends far beyond our screens.

Sources:

https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about 

https://insighttechtalk.com/tech-news/stop-saying-please-thank-you-ai-cost/ 

https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117 

https://esamurai.com/2025/08/14/ais-hidden-environmental-cost-why-its-far-more-harmful-than-search-engines/ 

https://www.iea.org/reports/energy-and-ai/ 

https://www.devera.ai/resources/the-environmental-impact-of-ai-energy-carbon-and-water-in-the-age-of-chatgpt 
