Saednews: A PhD candidate at VU Amsterdam has estimated that AI data centers may consume half of all data center electricity if current growth persists.
Alex de Vries-Gao, writing in the journal Joule, conducted a study estimating past, current, and future electricity usage by AI data centers.
The International Energy Agency recently reported that data centers accounted for about 1.5% of global electricity consumption in 2024, a figure that is rising rapidly.
De Vries-Gao noted that data centers support far more than AI queries, including services such as cloud storage and bitcoin mining.
AI developers have acknowledged the heavy computing power required to run large language models like ChatGPT.
Some companies have started generating their own electricity to meet demand.
However, over the past year, AI firms have become less transparent about their energy use.
De Vries-Gao therefore estimated electricity consumption based on publicly available data.
He analyzed chips produced by Taiwan Semiconductor Manufacturing Company, which supplies Nvidia and others.
He combined estimates from analysts, earnings reports, hardware sales, and electricity consumption reports for AI hardware.
Using this data, he calculated that AI providers will consume about 82 terawatt-hours of electricity in 2025, roughly equal to Switzerland’s total power use.
Assuming AI demand doubles by year-end, AI data centers could consume about half of global data center electricity.
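The doubling scenario can be sketched as a back-of-the-envelope calculation. Only the 82 TWh figure comes from the study as reported; the total data center consumption value below is an illustrative assumption, not a figure from the article.

```python
# Sketch of the article's doubling scenario. The 82 TWh estimate for AI in
# 2025 comes from the study; the total data center figure is an assumption
# chosen for illustration only.

AI_2025_TWH = 82.0             # de Vries-Gao's 2025 estimate for AI providers
DATA_CENTER_TOTAL_TWH = 330.0  # assumed global data center total (illustrative)

ai_doubled = AI_2025_TWH * 2   # demand doubles by year-end
share = ai_doubled / DATA_CENTER_TOTAL_TWH

print(f"AI consumption if demand doubles: {ai_doubled:.0f} TWh")
print(f"Share of data center electricity: {share:.0%}")
```

Under these assumed numbers, doubled AI demand lands at roughly half of the data center total, which is the proportion the study describes.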
De Vries-Gao warned that the rise in AI power use risks not only raising electricity prices but also causing environmental harm.
“If most AI providers use grid electricity, greenhouse gas emissions could rise sharply due to coal-based power generation,” he said.
This increase could accelerate global warming.