Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology, published the new analysis as a commentary. De Vries-Gao started Digiconomist in the late 2010s to examine the impact of bitcoin mining, another exceptionally energy-intensive activity, on the environment. He says the need to look at AI has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use vast amounts of energy. According to his research, worldwide AI energy demand is now on track to surpass the demand from bitcoin mining by the end of this year.
He says the money cryptocurrency miners spent to get where they are today is minuscule compared with what Google, Microsoft, and other major tech companies are pouring into AI, and that AI's energy use is escalating faster and poses a bigger threat.
The boom in AI is already affecting Big Tech's climate targets. In recent sustainability reports, tech giants have acknowledged that AI is largely responsible for driving up their energy use. Google's greenhouse gas emissions, for example, have risen 48% since 2019, complicating the company's goal of reaching net zero by 2030.
Reducing emissions may be challenging as the company further integrates AI into its products, because of increasing energy demands from the greater intensity of AI compute, according to Google's 2024 sustainability report.
In a report released last month, the International Energy Agency found that data centers accounted for 1.4 percent of global electricity consumption in 2024, or 415 terawatt-hours, slightly less than Saudi Arabia's annual electricity demand. That figure is only set to grow: data-center energy use has risen four times faster than overall consumption in recent years, and investment in data centers has nearly doubled since 2022, driven largely by massive expansions to accommodate new AI capacity. The IEA forecasts that data-center electricity consumption could exceed 900 TWh by the end of the decade.
Still, there are many open questions about how much of the electricity data centers use today goes to AI specifically. Data centers provide a range of services, including hosting cloud platforms and online infrastructure, that aren't necessarily tied to AI's energy-intensive workloads. And tech companies largely keep the details of their hardware and its energy use under wraps.
Some attempts to quantify AI's energy use have started from the user's end, such as estimating how much electricity goes into a single ChatGPT query. To get a more global picture, De Vries-Gao chose instead to look at the supply chain, starting from the manufacturing side.
De Vries-Gao points out that the worldwide production of AI hardware currently runs through a natural "bottleneck": Taiwan Semiconductor Manufacturing Company (TSMC), the undisputed leader in fabricating the advanced chips this hardware depends on. Companies such as Nvidia outsource chip production to TSMC, which also makes chips for companies like Google and AMD. (Both TSMC and Nvidia declined to comment for this article.)
De Vries-Gao pieced together an estimate of TSMC's production capacity using analyst estimates, earnings-call transcripts, and device details. He then calculated the rough share of global data-center demand taken up by AI hardware, based on publicly available power-consumption profiles and estimates of how heavily that hardware is used, which can vary depending on the workload. By his calculations, without additional production, AI will consume up to 82 terawatt-hours of electricity this year, roughly equivalent to Switzerland's annual electricity consumption. And if production capacity for AI hardware doubles this year, as analysts predict it will, demand could grow at a similar pace, accounting for nearly half of all data-center demand by the end of the year.
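The shape of this kind of supply-chain estimate can be sketched as back-of-envelope arithmetic. The figures below are hypothetical placeholders, not numbers from De Vries-Gao's analysis; only the structure of the calculation, multiplying installed hardware by its power draw, its utilization rate, and hours per year, reflects the approach described above.

```python
# Illustrative back-of-envelope estimate of annual AI electricity use.
# All input values are assumptions for demonstration, not figures from
# the paper or from TSMC/Nvidia disclosures.

ACCELERATORS_INSTALLED = 4_000_000  # assumed number of AI accelerators in service
POWER_PER_UNIT_KW = 1.2             # assumed average draw per unit, incl. overhead (kW)
UTILIZATION = 0.65                  # assumed fraction of time running at that draw
HOURS_PER_YEAR = 8_760

# kWh over a year, converted to terawatt-hours (1 TWh = 1e9 kWh)
energy_twh = (ACCELERATORS_INSTALLED * POWER_PER_UNIT_KW
              * UTILIZATION * HOURS_PER_YEAR) / 1e9

print(f"Estimated annual AI electricity use: {energy_twh:.1f} TWh")
```

Because the utilization rate and per-unit power draw are exactly the quantities outsiders cannot observe directly, small changes to those assumptions swing the result widely, which is why estimates like these carry large error bars.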
For all the data the paper draws on, much of what De Vries-Gao is doing is peering into a black box: we simply can't know certain things, such as how heavily each piece of AI hardware is actually used or how the industry will develop from here.
Given the number of unknowns at play, Sasha Luccioni, a researcher who studies AI and energy and serves as climate lead at the open-source machine-learning platform Hugging Face, cautioned against leaning too heavily on some of the new paper's conclusions. More disclosure from tech giants is crucial, says Luccioni, who was not involved in the study.
“We don’t have the information that [researchers] need to do this,” she says. “That’s why the error bar is so large.”
And tech companies do hold this information. In 2022, Google published a paper on machine learning and electricity use, noting that machine learning accounted for “10% to 15% of Google’s total energy use” from 2019 through 2021 and predicting that “by 2030 total carbon emissions from training will reduce.” But Google has not released more granular figures on how much electricity machine learning uses since that paper, which came out before Google Gemini’s debut in 2023. (Google declined to comment on this story.)
De Vries-Gao says that to make a proper assessment of AI's energy demand, you have to “deep-dive into the semiconductor supply chain.” If the big tech companies were simply publishing the same kind of information Google released three years ago, he adds, we would have a pretty good indicator of AI's energy use.