In 2050, artificial intelligence could consume more energy than France today

Two pickaxes attacking a mountain of hydrocarbons.

© Luke Conroy and Anne Fehres & AI4Media / Better Images of AI / Models Built From Fossils / CC-BY 4.0

Among the many problems facing AI today, one of the biggest is undoubtedly its enormous energy cost. The data centers and power-hungry graphics cards that run ChatGPT and its rivals are already sparking plenty of debate, as much for their water use as for their electricity consumption. And the trend shows no signs of slowing down.

A third more energy than current French consumption

According to a study published by the research firm Deloitte and spotted by NextInpact, at the current rate of adoption AI could require no less than 3,550 terawatt-hours of energy in 2050. In just 25 years, the artificial intelligence sector's consumption could thus increase ninefold compared with today (the same study estimates that the industry consumed 382 terawatt-hours in 2023).

Deloitte's projections to 2050: AI energy consumption peaking at 3,550 TWh in the worst case.

© Deloitte

To put these figures in context, that would represent 37% more energy than the whole of France consumed in 2023. The study also specifies that these figures already factor in "improvements in the energy efficiency of data centers". Enough to "directly impact electricity suppliers and challenge countries' ambitions in terms of climate neutrality" if nothing is done.
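As a quick back-of-the-envelope check on the figures above (a sketch using only the numbers reported in this article; the French consumption value is merely implied by the 37% comparison, not an official statistic):

```python
# Sanity check of the Deloitte figures quoted above.
ai_2023_twh = 382    # estimated AI energy consumption in 2023 (Deloitte)
ai_2050_twh = 3550   # worst-case projection for 2050 (Deloitte)

growth_factor = ai_2050_twh / ai_2023_twh
print(f"2023 -> 2050 growth factor: x{growth_factor:.1f}")  # ~x9.3, i.e. a ninefold increase

# 3,550 TWh is said to be 37% more than France's total 2023 consumption,
# which implies France consumed roughly:
implied_france_2023_twh = ai_2050_twh / 1.37
print(f"Implied French consumption in 2023: ~{implied_france_2023_twh:.0f} TWh")  # ~2,600 TWh
```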

The wall of reality

Deloitte nevertheless specifies that these figures come from a maximalist usage projection, in which development continues at its current pace. A "reference" scenario, which would consist of "limiting AI deployment to the simplest and most profitable applications", would cap consumption at 1,680 terawatt-hours in 2050.

On shorter time horizons, the projections are hardly more reassuring. According to a study carried out this time by Morgan Stanley, AI energy costs could grow by 70% per year through the end of the decade. "By 2027, generative AI could consume as much energy as Spain did in 2022," notes the firm.

The significant increase in energy requirements is not well understood by the market and has not been factored into the pricing of a number of stocks

Morgan Stanley

Morgan Stanley itself acknowledges as much: "we were surprised by our own projections". The firm's specialist on the subject even issues a warning, noting that "the significant increase in energy requirements is not well understood by the market and has not been factored into the pricing of a number of stocks".

And what about waste?

AI's carbon footprint is not limited to the energy consumption of data centers alone. The production and replacement of the components needed to upgrade those data centers must also be taken into account.

According to a study published in the journal Nature at the end of 2023, the replacement of processors and graphics cards could generate 2.5 million tonnes of electronic waste per year from 2030 onward if no mitigation measures are taken. That is the equivalent of 13 billion iPhone 15 Pros thrown in the trash. Enough to put the use of ChatGPT, and the hype around the long-awaited next generation of language models, into perspective.
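As a rough order-of-magnitude check of that comparison (a sketch assuming an iPhone 15 Pro weighs about 187 g, Apple's listed weight; the 13 billion count comes from the article):

```python
# Order-of-magnitude check of the "13 billion iPhone 15 Pros" comparison.
iphone_15_pro_kg = 0.187  # assumed unit weight (~187 g)
iphone_count = 13e9       # number of phones cited above

total_tonnes = iphone_count * iphone_15_pro_kg / 1000
print(f"~{total_tonnes / 1e6:.1f} million tonnes")  # ~2.4 Mt, consistent with the 2.5 Mt figure
```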
