Generative AI is energy-intensive, and its environmental costs are difficult to calculate. Consider the downstream impact of generative AI on the environment when examining your business’s own sustainability targets.
- What side effects might not be immediately apparent but could have a significant impact?
- When does most of the electricity usage occur: during training or regular use?
- Do “more efficient” AI models actually address any sustainability problems?
The effect of generative AI on electricity use, water consumption, and air quality
AI’s impact on air pollutants
In December 2024, researchers at the University of California, Riverside, and the California Institute of Technology calculated that training Meta’s Llama-3.1 produced the same amount of air pollution as more than 10,000 round trips by car between Los Angeles and New York City.
The increased air pollution from backup generators at data centers running AI caused regional public health costs of approximately $190 million to $260 million a year, the UC Riverside and Caltech researchers found.
AI’s effect on power use
A 2024 report from the International Energy Agency said ChatGPT queries used about 10 terawatt-hours more electricity per year than the total consumed annually by Google searches.
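A back-of-the-envelope sketch shows how a figure of this magnitude can be derived. The per-query estimates below (~2.9 Wh per ChatGPT request, ~0.3 Wh per Google search, roughly 9 billion searches per day) are the figures widely attributed to the IEA’s 2024 reporting; treat them as assumptions, not exact measurements.

```python
# Rough reconstruction of the IEA-style comparison. All inputs are
# assumed estimates, not measured values.
WH_PER_CHATGPT_QUERY = 2.9   # assumed Wh per ChatGPT request
WH_PER_GOOGLE_QUERY = 0.3    # assumed Wh per Google search
QUERIES_PER_DAY = 9e9        # assumed daily Google search volume

extra_wh_per_year = (WH_PER_CHATGPT_QUERY - WH_PER_GOOGLE_QUERY) \
    * QUERIES_PER_DAY * 365
extra_twh_per_year = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh

print(f"Additional electricity: {extra_twh_per_year:.1f} TWh per year")
```

With these inputs the difference works out to roughly 8.5 TWh per year, on the order of the 10 TWh figure cited above.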
AI’s effects on water usage
Sapping more power strains already struggling electrical grids, leading to brownouts or outages. Drawing water from already drought-prone areas, such as fast-growing Phoenix, Arizona, or parts of California, may contribute to biodiversity loss and wildfires.
SEE: Sending One Email With ChatGPT Is the Equivalent of Consuming One Bottle of Water
Does training or everyday use of AI consume more resources?
“Training is a time-consuming and energy-intensive process,” the IEA wrote in its 2025 Energy and AI World Energy Outlook Special Report. One GPU of the kind suited for AI training draws about as much energy as a toaster at its maximum rated power consumption. The agency calculated it took 42.4 gigawatt-hours to train OpenAI’s GPT-4, the equivalent of the average household power usage of 28,500 homes in an advanced economy.
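The household comparison is easy to sanity-check: dividing the training total by the number of households gives the per-household consumption the comparison implies. Only the two figures quoted above are used; the division itself adds no new assumptions.

```python
# Sanity check on the IEA's GPT-4 training estimate quoted above.
TRAINING_GWH = 42.4    # reported training energy for GPT-4
HOUSEHOLDS = 28_500    # households in the IEA comparison

kwh_per_household = TRAINING_GWH * 1e6 / HOUSEHOLDS  # 1 GWh = 1e6 kWh
print(f"{kwh_per_household:.0f} kWh per household")
```

The result, roughly 1,500 kWh per household, is a plausible annual electricity figure for households in some advanced economies, which is consistent with the comparison being read on an annual basis.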
What about everyday use? Query size, model size, the degree of inference-time scaling, and other factors all influence how much electricity an AI model draws during the inference stage, when it parses and responds to a prompt. These factors, plus a lack of data about the size and deployment of consumer AI models, make the environmental impact very difficult to measure. However, generative AI undeniably draws more power than conventional computing.
“The inference phase (also the operational phase) was already responsible for the majority (60%) of AI energy costs at Google even before mass adoption of generative AI applications happened (2019-2021),” wrote Alex de Vries, founder of the research blog Digiconomist and the Bitcoin Energy Consumption Index, in an email to TechRepublic. “Even though we don’t have exact numbers, mass adoption of AI applications will have increased the weight of the inference (/operational) phase even further.”
Meanwhile, AI models continue to expand. “Increasing the model size (parameters) will result in better performance, but increases the energy use of both training and inference,” said de Vries.
DOWNLOAD: This Greentech Quick Glossary from TechRepublic Premium
DeepSeek claimed to be more energy efficient, but it’s complicated
DeepSeek’s AI models have been lauded for matching much of their major competitors’ performance without consuming as much energy, and at a lower price tag; however, the reality is more complicated.
DeepSeek’s mixture-of-experts approach reduces costs by activating only a subset of the model’s parameters for each input, so it doesn’t require as much computational power or consume as much energy during training. However, the IEA found that everyday use of the inference-time scaling method used by DeepSeek-R1 consumes a significant amount of electricity. In general, large reasoning models consume the most electricity: the training is less demanding, but the usage is more demanding, according to MIT Technology Review.
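A minimal sketch illustrates the mixture-of-experts idea in general terms (this is not DeepSeek’s actual architecture, and all sizes below are made up for illustration): a gating network scores every expert, but only the top-k experts actually run for each token, so compute per token scales with k rather than with the total number of experts.

```python
import numpy as np

# Toy mixture-of-experts routing, illustrative only.
rng = np.random.default_rng(0)

N_EXPERTS, TOP_K, D = 8, 2, 16  # hypothetical sizes
experts = [rng.standard_normal((D, D)) * 0.1 for _ in range(N_EXPERTS)]
gate_w = rng.standard_normal((D, N_EXPERTS)) * 0.1

def moe_forward(x):
    """Route a single token vector x through its top-k experts only."""
    scores = x @ gate_w
    top = np.argsort(scores)[-TOP_K:]   # indices of the top-k experts
    weights = np.exp(scores[top])
    weights /= weights.sum()            # softmax over the chosen experts
    # Only TOP_K of the N_EXPERTS weight matrices are touched per token,
    # which is where the training-cost savings come from.
    return sum(w * (x @ experts[i]) for i, w in zip(top, weights))

token = rng.standard_normal(D)
out = moe_forward(token)
print(out.shape)
```

The model holds eight experts’ worth of parameters but does only two experts’ worth of matrix multiplies per token, which is the trade-off the paragraph above describes.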
“DeepSeek-R1 and OpenAI’s o1 model are substantially more energy intensive than other large language models, ” wrote IEA in the 2025 Energy and AI report.
The IEA also pointed out the “rebound effect, ” where the product’s increased efficiency leads to more users adopting it; as a result, the product continues to consume more resources.
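The rebound effect is easy to see with hypothetical numbers: even if a model becomes twice as efficient per query, total consumption still rises whenever adoption grows faster than efficiency improves. The figures below are invented purely for illustration.

```python
# Illustrative rebound-effect arithmetic; all numbers are hypothetical.
wh_per_query_old, queries_old = 3.0, 1e9   # assumed baseline
wh_per_query_new, queries_new = 1.5, 4e9   # 2x efficiency, 4x adoption

total_old = wh_per_query_old * queries_old
total_new = wh_per_query_new * queries_new
print(total_new / total_old)  # total energy use doubles despite efficiency
```

Here a 2x per-query efficiency gain is swamped by a 4x increase in usage, so total consumption doubles.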
Can AI offset the resources it consumes?
Tech companies still like to present themselves as good stewards. Google pursues energy-conscious certifications globally, including signing the Climate Neutral Data Centre Pact in Europe. Microsoft, which saw similar increases in water and electricity use in its 2024 sustainability reporting, is considering reopening a nuclear power plant at Three Mile Island in Pennsylvania to power its AI data centers.
SEE: The proliferation of AI has created a sustained boom in data centers and related infrastructure.
Supporters of AI might argue its benefits outweigh the risks. Generative AI can be used in sustainability projects. AI can help comb through massive datasets of information about carbon emissions or track emissions of greenhouse gases. Additionally, AI companies are continually working on improving the efficiency of their models. But what “efficiency” really means always seems to be the catch.
“There are some bottlenecks (like e.g. grid capacity) that could hold back the growth in AI and its power demand,” said de Vries. “This is hard to predict, also considering that it’s not possible to predict future demand for AI (for example the AI hype could fade to a certain extent), but any hope for limiting AI power demand comes from this. Due to the ‘bigger is better’ dynamic AI is fundamentally incompatible with environmental sustainability.”
Then there is the question of how far down the supply chain AI’s impact should be counted. “Indirect emissions from the consumption of electricity are the most significant component of emissions from hardware manufacturing [of semiconductors],” said the IEA in the Energy and AI report.
The cost of hardware and its use has gone down as companies understand the needs of generative AI better and pivot to products focused on it.
“At the hardware level, costs have declined by 30% annually, while energy efficiency has improved by 40% each year,” according to Stanford University’s 2025 AI Index Report.
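Compounding those two rates shows how quickly they stack up. The three-year horizon below is an arbitrary choice for illustration; only the 30% and 40% annual figures come from the report quoted above.

```python
# Compounding the Stanford AI Index rates over an assumed 3-year window.
YEARS = 3
cost_factor = 0.70 ** YEARS        # 30% annual cost decline
efficiency_factor = 1.40 ** YEARS  # 40% annual efficiency gain

print(f"Cost: {cost_factor:.2f}x, efficiency: {efficiency_factor:.2f}x")
```

After three years at those rates, hardware costs fall to about a third of the baseline while energy efficiency nearly triples.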
DOWNLOAD: This IT Data Center Green Energy Policy from TechRepublic Premium
Consider how generative AI affects your business’ environmental targets
Generative AI is becoming mainstream. Microsoft’s Copilot is included by default in some PCs; smartphone makers are eagerly adding video editing AI and assistants; and Google gives out its Gemini Advanced model for free to students.
Tech companies that set ambitious sustainability targets may find it difficult to hit their goals now that they produce and use generative AI products.
“AI can have dramatic impacts on ESG reports and also the ability of the companies concerned to reach their own climate goals,” said de Vries.
DOWNLOAD: This Customizable Environmental Policy from TechRepublic Premium
According to Google’s 2024 Environmental Report, the tech giant’s data centers consumed 17% more water than in 2023. Google attributed this to “the expansion of AI products and services” and noted “similar growth in electricity use.” Google’s data center waste generation and water use both increased.
“As AI adoption accelerates, IT leaders are increasingly aware that smarter devices don’t directly correlate to more efficient power consumption, ” said Dan Root, head of global strategic alliances at ClickShare. “The spike in compute demand from AI tools means IT departments must look for offset opportunities elsewhere in their stack. ”
As the International Energy Agency pointed out in its 2024 electricity report, both the source of electricity and the infrastructure need to be considered if the world is to meet the energy demands of AI.
“You can make/keep models a bit smaller to reduce their energy requirement, but this also means you have to be prepared to sacrifice performance, ” said de Vries.