The real impact of Google’s latest push into artificial intelligence on its greenhouse gas emissions has been laid bare in the company’s most recent annual environmental report.
The company emitted 14.3 million tonnes of carbon dioxide equivalent in 2023, largely as a result of expanding its data centres to support its AI developments. This is a 48% increase on its 2019 figure and a 13% increase on 2022.
According to the report’s authors, this rise was primarily driven by increases in data centre energy consumption and supply chain emissions.
The report warns that reducing emissions may prove challenging as Google integrates more AI into its products, owing to the growing power demands of AI compute and the emissions expected from further investment in specialised infrastructure.
SEE: How Microsoft, Google Cloud, IBM & Dell are Working on Reducing AI’s Climate Harms
Google says it is unable to separate out how much of its overall data centre emissions is attributable specifically to AI.
Google has pledged to reach net-zero emissions across all of its operations and value chain by 2030. The report states this goal is now deemed “extremely ambitious” and “will require (Google) to navigate significant uncertainty”.
The report goes on to state that, because the environmental impact of AI is “complex and difficult to predict”, the company can only publish data centre-wide metrics that encompass cloud, storage and other operations. This means the environmental harm caused specifically by AI training and use in 2023 remains undisclosed for the time being.
That being said, in 2022, David Patterson, a Google engineer, wrote in a blog post, “Our data shows that ML training and inference are only 10%–15% of Google’s total energy use for each of the last three years.” However, this proportion is likely to have increased since then.
SEE: All You Need to Know About Greentech
Why does AI contribute to tech companies’ higher emissions?
Like most of its competitors, Google has introduced a number of AI projects and features over the last year, including Gemini, Gemma, AI Overviews and image generation in Search, and AI security tools.
AI systems, particularly those involved in training large language models, demand substantial computational power. This translates into higher electricity usage and, consequently, more carbon emissions than normal online activity.
SEE: Artificial Intelligence Cheat Sheet
According to a study by Google and UC Berkeley, training OpenAI’s GPT-3 generated 552 metric tonnes of carbon — the equivalent of driving 112 petrol cars for a year. Additionally, studies suggest that generative AI systems consume about 33 times more energy than machines running task-specific software.
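Estimates like these are typically built up from a handful of measurable quantities: how long the accelerators ran, how much power they drew, the data centre’s overhead (PUE) and the carbon intensity of the local grid. The Python sketch below shows the shape of that calculation; every input figure is an illustrative assumption, not a number taken from the study or from Google’s report.

```python
def training_emissions_tonnes(accelerator_hours: float,
                              watts_per_accelerator: float,
                              pue: float,
                              grid_kg_co2e_per_kwh: float) -> float:
    """Rough CO2e estimate (tonnes) for a training run: energy drawn by the chips,
    scaled by data centre overhead, multiplied by the grid's carbon intensity."""
    energy_kwh = accelerator_hours * watts_per_accelerator / 1000 * pue
    return energy_kwh * grid_kg_co2e_per_kwh / 1000


# Hypothetical inputs, chosen only to illustrate the calculation.
print(training_emissions_tonnes(
    accelerator_hours=3_000_000,   # e.g. 10,000 accelerators running ~300 hours (assumed)
    watts_per_accelerator=300,     # average draw per device (assumed)
    pue=1.1,                       # data centre power usage effectiveness (assumed)
    grid_kg_co2e_per_kwh=0.4,      # grid carbon intensity (assumed)
))  # ≈ 396 tonnes CO2e — the same order of magnitude as the GPT-3 estimate above
```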
Last year, Google’s total data centre electricity consumption grew by 17%, and while we don’t know what proportion of this was due to AI-related activities, the company did admit it “expect(s) this trend to continue in the future”.
Google is not the first major tech company to report that AI advancements are driving up its emissions and making them harder to manage. In May, Microsoft announced that its emissions had risen by 29% since 2020, primarily as a result of constructing new data centres. Its environmental sustainability report attributes this in part to the company’s position as a leading cloud provider with an expanding data centre footprint.
Leaked documents obtained by Business Insider in April reportedly show that Microsoft has acquired more than 500MW of additional data centre capacity since July 2023, and that its GPU footprint now supports live “AI clusters” in 98 locations around the world.
Four years ago, Microsoft President Brad Smith referred to the company’s pledge to become carbon negative by 2030 as a “moonshot”. However, speaking on Bloomberg’s Zero podcast in May, he admitted that “the moon has moved” since then and is now “more than five times as far away”.
Alex de Vries, founder of Digiconomist, a platform that tracks the sustainability of digital trends including AI, believes Google’s and Microsoft’s environmental reports show that tech bosses are not taking sustainability as seriously as AI development. They may think they are, he said in an email to TechRepublic, but the reality is that they are currently clearly prioritizing growth over meeting those climate targets.
De Vries added that Google is already struggling to source enough renewable energy to keep up with its growing demand, meaning every additional MWh it consumes carries a higher carbon cost. The global supply of renewables is limited, he argued, and AI-related electricity demand is only adding to the strain; something will have to change drastically for those climate targets to remain achievable.
Google’s skyrocketing emissions may have a trickle-down effect on the businesses using its AI products, each of which has its own environmental standards and targets to meet. “If Google is part of your value chain, Google’s emissions going up also means your Scope 3 emissions are going up,” de Vries told TechRepublic.
How Google is managing its AI emissions
Google’s environmental report identifies a number of ways the company is controlling the energy demands of its AI developments. Its latest Tensor Processing Unit, Trillium, is over 67% more energy-efficient than the fifth-generation TPU, while its data centres are over 1.8 times more energy-efficient than typical enterprise data centres.
Google’s data centres now deliver roughly four times as much computing power for the same amount of electrical power as they did five years ago.
In March 2024 at NVIDIA GTC, TechRepublic spoke with Mark Lohmeyer, vice president and general manager of compute and AI/ML Infrastructure at Google Cloud, about how its TPUs are getting more efficient.
He said, “We use liquid cooling for those TPUs to run a highly efficient form of accelerated compute with our own in-house TPUs, which makes them run faster, but also much more energy-efficient, and as a result, much more cost effective.”
Google Cloud also uses software to manage uptime sustainably. “What you don’t want to have is a bunch of GPUs or any type of compute deployed using power but not actively producing, you know, the outcomes that we’re looking for,” Lohmeyer told TechRepublic. Driving high levels of infrastructure utilisation, he added, is also key to sustainability and energy efficiency.
According to Google’s 2024 environmental report, the company is managing the environmental footprint of its AI in three ways:
- Model optimisation: For example, it boosted the training efficiency of its fifth-generation TPU by 39% with techniques that accelerate training, such as quantisation, where the precision of the numbers used to represent a model’s parameters is reduced to cut the computational load (a minimal sketch of this idea follows this list).
- Efficient infrastructure: Its fourth-generation TPU was 2.7 times more energy-efficient than the third generation. In 2023, Google’s water stewardship program offset 18% of its water usage, much of which goes toward cooling data centres.
- Emissions reduction: Last year, 64% of the energy consumed by Google’s data centres came from carbon-free sources, which include renewable sources and carbon capture schemes. Additionally, it deployed demand response capabilities and carbon-intelligent computing platforms at its data centres.
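As a rough illustration of the quantisation technique mentioned under model optimisation, the sketch below maps 32-bit floating-point parameters onto 8-bit integers using NumPy. It is a generic, minimal example rather than Google’s actual training pipeline, and the parameter values are random stand-ins, but the core trade-off is the same: a small loss of precision in exchange for a representation that is four times smaller and cheaper to compute with.

```python
import numpy as np

def quantise_int8(weights: np.ndarray):
    """Map float32 weights onto 8-bit integers with a single scale factor."""
    scale = np.abs(weights).max() / 127.0                        # largest weight maps to ±127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantise(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Random stand-in for a layer's parameters (hypothetical, for illustration only).
weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantise_int8(weights)
approx = dequantise(q, scale)

print("max rounding error:", np.abs(weights - approx).max())     # small precision loss
print("storage:", weights.nbytes, "bytes ->", q.nbytes, "bytes")  # 4x smaller
```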
Google is also developing AI products that address climate change more broadly, such as flood prediction models, fuel-efficient routing in Google Maps, and the Green Light tool, which helps traffic engineers adjust traffic light timing to reduce stop-and-go traffic and fuel consumption.
Emissions goals could be outpaced by AI demand.
Google claims that only about 0.1% of the world’s electricity demand comes from its data centres, which, among other things, power its AI activities. Indeed, according to the International Energy Agency, data centres and data transmission networks are responsible for 1% of energy-related emissions.
However, this is expected to increase significantly over the coming years, with data centre electricity consumption projected to double between 2022 and 2026. According to SemiAnalysis, data centres will account for about 4.5% of global energy demand by 2030.
Training and operating AI models in data centres requires a lot of energy, but the production and transport of the chips and other hardware also contribute. Due to this rising demand, the IEA predicts that AI will use 10 times as much electricity in 2026 as it did in 2023.
SEE: AI Is Causing Power and Cooling Problems in Australian Data Centres
Additionally, data centres require a lot of water for cooling, and even more so when running energy-intensive AI workloads. According to a study from UC Riverside, the amount of water withdrawn for AI could be equivalent to half of the UK’s annual withdrawal by 2027.
Increased electricity demand could lead to a return to non-renewable energy for tech companies.
Tech companies have long been significant investors in renewable energy; Google’s most recent environmental report states that the company purchased more than 25 TWh of it in 2023 alone. However, there are concerns that the skyrocketing energy demand driven by their AI endeavours will keep coal- and oil-fired plants running that would otherwise have been shut down.
For example, in December, county supervisors in northern Virginia approved the construction of up to 37 data centres across just 2,000 acres, prompting proposals to expand coal power usage.