Google’s carbon emissions have surged by 48% in five years, driven by the explosive growth of artificial intelligence (AI).
This highlights a major challenge in the rapid development of AI technology: its insatiable energy consumption.
Why Does AI Consume So Much Energy?
The language models underpinning generative AI require immense computing power to train on billions of data points, necessitating powerful servers.
Each time a user interacts with ChatGPT or any other generative AI, it activates servers housed in data centers. These servers consume electricity, generate heat, and require cooling systems that further increase energy consumption.
According to the International Energy Agency (IEA), data centers typically use about 40% of their electricity to power servers and another 40% for cooling.
Multiple studies have found that a single ChatGPT query consumes, on average, about 10 times more energy than a simple Google search.
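To give a rough sense of what that ratio implies at scale, the sketch below compares per-query energy using illustrative ballpark figures of about 0.3 Wh for a standard search and about 3 Wh for a generative-AI query. These inputs are assumptions chosen for illustration, not measurements from the studies cited above.

```python
# Back-of-envelope comparison of per-query energy use.
# The per-query figures below are illustrative assumptions
# (~0.3 Wh per Google search, ~3 Wh per ChatGPT query),
# not measurements from the studies referenced in the article.

GOOGLE_SEARCH_WH = 0.3   # assumed energy per standard search, in watt-hours
CHATGPT_QUERY_WH = 3.0   # assumed energy per generative-AI query, in watt-hours

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"Assumed ratio: {ratio:.0f}x more energy per AI query")

# Scaling up: extra energy if one billion daily searches became AI queries.
daily_queries = 1_000_000_000
extra_wh_per_day = daily_queries * (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH)
extra_gwh_per_year = extra_wh_per_day * 365 / 1e9  # Wh -> GWh
print(f"Hypothetical extra demand: ~{extra_gwh_per_year:,.0f} GWh per year")
```

Under these assumed figures, shifting a billion daily searches to generative AI would add on the order of 1 TWh of demand per year, which is why the per-query gap matters at internet scale.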
Data Center Expansion and Rising Energy Demand
The AI boom since 2022 has prompted internet giants like Amazon, Google, and Microsoft to invest heavily in data center construction worldwide.
Google’s environmental report highlights the increased energy consumption in its data centers and the surge in emissions associated with building new facilities and upgrading existing ones.
How Much Energy Does AI Consume?
Before the AI craze, data centers accounted for roughly 1% of global electricity consumption, according to the IEA.
Including AI and the cryptocurrency sector, data centers consumed nearly 460 terawatt-hours (TWh) of electricity in 2022, about 2% of total global electricity production.
The IEA warns in a report that this figure could double by 2026, reaching 1,000 TWh, roughly equivalent to Japan's annual electricity consumption.
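The arithmetic behind that projection is straightforward, as the short sketch below shows; the figure used for Japan's annual consumption is an approximate, illustrative assumption rather than an IEA number.

```python
# Rough arithmetic behind the IEA projection quoted above.
# Japan's annual electricity consumption is assumed here to be
# roughly 950 TWh (an approximate, illustrative figure).

data_centers_2022_twh = 460    # IEA estimate for 2022 (incl. AI and crypto)
projected_2026_twh = 1_000     # possible 2026 figure cited by the IEA
japan_consumption_twh = 950    # assumed annual consumption for comparison

growth = projected_2026_twh / data_centers_2022_twh
print(f"Implied growth, 2022 to 2026: ~{growth:.1f}x")  # ~2.2x, roughly a doubling
print(f"2026 projection vs Japan: ~{projected_2026_twh / japan_consumption_twh:.2f}x")
```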
Alex de Vries, an economist at the Vrije Universiteit Amsterdam, modeled the electricity usage required for AI alone based on sales projections from Nvidia, whose processors are essential for training AI models.
If Nvidia’s 2023 sales estimates prove accurate and all of those servers operate at maximum capacity, they could consume between 85.4 and 134 TWh annually, comparable to Argentina’s annual electricity consumption.
De Vries noted that his figures were conservative, as they didn’t account for factors like cooling requirements. Last year, Nvidia’s sales exceeded projections, suggesting that the actual energy consumption could be even higher.
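De Vries’s approach is essentially a back-of-envelope multiplication: projected AI server shipments times per-server power draw times hours of operation. The sketch below reproduces that kind of estimate with illustrative inputs (roughly 1.5 million servers drawing about 6.5 kW each at full load); these values are assumptions chosen to land near the lower bound quoted above, not figures taken directly from his paper.

```python
# Back-of-envelope estimate in the spirit of de Vries's method:
# projected AI server count x per-server power draw x hours per year.
# The inputs below are illustrative assumptions, chosen so the result
# lands near the lower bound (85.4 TWh) quoted in the article.

servers = 1_500_000          # assumed number of AI servers in operation
power_kw_per_server = 6.5    # assumed draw per server at full load, in kW
hours_per_year = 365 * 24    # continuous operation at maximum capacity

annual_twh = servers * power_kw_per_server * hours_per_year / 1e9  # kWh -> TWh
print(f"Estimated annual consumption: ~{annual_twh:.1f} TWh")  # ~85.4 TWh

# Note: this excludes cooling overhead, which the article notes would push
# the real figure higher.
```

Adding a cooling overhead multiplier would scale this figure up proportionally, which is why de Vries describes his estimates as conservative.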
How Are Data Centers Handling Increased Energy Demand?
AI will transform the data center sector, acknowledges Fabrice Coquio of Digital Realty, which operates a massive data center near Paris that is partly dedicated to AI.
Coquio compared the impact of AI to that of cloud computing, potentially even more significant in terms of deployment.
While average-power servers can be cooled with conventional room air conditioning, the high-powered servers required for AI generate far more heat and need water to be circulated directly through the equipment to cool it.
Coquio emphasized that AI demands different servers, storage, and communication equipment.
How Are Tech Giants Responding?
As tech giants strive to integrate more AI into their products, experts fear a surge in electricity consumption.
Like Google, Microsoft, the world’s second-largest cloud provider, saw its CO2 emissions jump by 30% in 2023 compared to 2020.
While Google, Amazon, and Microsoft highlight their investments in renewable energy to power their data centers, their carbon neutrality goals seem increasingly distant.
AWS (Amazon’s cloud division) has pledged to become a net-zero carbon company by 2040, while Google aims to achieve net-zero emissions across all its operations by 2030.
Microsoft has also set a goal of becoming carbon negative by 2030, a promise made before the AI boom, as Microsoft President Brad Smith acknowledged in a May interview with Bloomberg.