AI's Voracious and Dangerous Appetite for Energy

Publication date: 2 April 2025

There is much talk today about Artificial Intelligence (AI) and its huge potential, but another side of this phenomenal advance is rarely discussed, though it shapes a great deal: its insatiable energy consumption.

Though seldom in the headlines, AI's energy consumption is already reshaping the race for technological leadership worldwide, while also posing an immense environmental challenge.

So much so that the servers that power AI, housed in data centers, consume colossal amounts of electricity and water while generating carbon emissions that threaten global climate goals.

The debate over AI's sustainability and its relationship to competition among tech giants is taking a critical turn, because the environmental impact is not only an ethical issue but a strategic factor that could determine who leads this race.

The International Energy Agency (IEA) estimates that by 2025, data centers, largely driven by AI, will consume between 620 and 1,050 terawatt-hours (TWh) globally, a significant increase from 460 TWh in 2022, which already accounted for 1.4% to 1.7% of global electricity consumption.

This March, a Goldman Sachs Research report revealed that global data center energy demand will increase by 50% by 2027 and by as much as 165% by the end of the decade, compared to 2023. More data centers and even more energy will be needed to meet AI's voracious appetite for this resource.

In the United States, servers dedicated to AI could account for between 6.7% and 12% of total national electricity consumption within the next three years, according to official figures.

By 2028, the energy demand of US data centers supporting AI systems will reach at least 325 terawatt-hours (TWh), according to a report released by the US Department of Energy. That figure exceeds the 2023 national electricity consumption of countries such as Spain (246 TWh), Italy (298 TWh), and the United Kingdom (287 TWh).

But the environmental impact goes beyond electricity. Cooling these servers consumes millions of liters of water daily. Google's data centers alone used nearly 6 billion gallons of water last year.

Carbon emissions are also skyrocketing: Microsoft, a leader in AI, reported a 2023 carbon footprint exceeding 17 million tons of CO2 and other greenhouse gases, a 40% increase over the 12 million tons of 2020.

Undoubtedly, the figures compiled here show that AI's ecological footprint is growing faster than many companies can mitigate.

The Environmental Cost Could Define the Winner

In the competition for AI leadership, "winning the day" no longer depends solely on the computing power or algorithmic innovation of nations or companies, but also on how environmental and energy constraints are managed.

For example, China has invested $60 billion in hydropowered data centers, giving it an advantage in operating costs and sustainability compared to the US, where dependence on fossil fuels remains high.

In Europe, the European Union Artificial Intelligence Act (EU AI Act)—groundbreaking legislation adopted by the EU to regulate the development and use of AI systems within its territory—requires companies to report the energy consumption of their AI models, which could force sustainable innovations.

Those who fail to balance the power of their AI with a manageable environmental footprint will lose ground in the competition: the inability to mitigate emissions, or reliance on scarce resources like water, could limit AI's scalability and yield an advantage to those who manage it. The environmental cost thus translates into economic and geopolitical cost.

Green and Smart Alternatives

In contrast to the above, so-called "Green AI" is emerging as an alternative that seeks to develop more efficient algorithms and hardware to reduce energy consumption without sacrificing quality performance.

In this regard, simpler and more transparent AI models, with less computational complexity, are being developed to facilitate sustainable deployment, while AI itself is used to optimize the use of natural resources such as energy and water, contributing to a more planet-friendly future.

CubaSí asked a generative AI whether it believed humanity could significantly reduce the energy consumption required by AI in the coming years; its response was not absolute, but it at least offered hope:

“The possibility of humanity significantly reducing the energy consumption required by AI in the coming years is a complex issue, but there are several factors that suggest this could be feasible.” Among them, it mentioned the use of more efficient processors and chips that require less energy to perform complex calculations, as well as optimized algorithms.

It also mentioned powering data centers with renewable energy, as well as transfer learning, which can reduce the need to train models from scratch, saving energy.

It also pointed out the challenges: “As AI becomes more integrated into everyday life and various industries, energy demand could increase, complicating efforts to reduce overall consumption, while changing established industry practices can be a slow and challenging process.

“Although there are significant challenges, the combination of technological advances, effective policies, and increased awareness of sustainability suggests that humanity may be able to significantly reduce energy consumption associated with AI in the coming years. The key will be collaboration between governments, industries, and the scientific community to push these initiatives forward,” it concluded optimistically.

Translated by Amilkal Labañino / CubaSí Translation Staff
