AI Energy Consumption Projected to Surpass Bitcoin Mining by 2025

A recent study warns that artificial intelligence's electricity consumption could exceed that of Bitcoin mining by the end of 2025. According to research published in the scientific journal Joule, AI may account for nearly half of the total energy usage of global data centers within this timeframe.

The analysis, led by Alex de Vries-Gao, a PhD researcher at the Vrije Universiteit Amsterdam, highlights the rapidly growing energy demands of AI-specific hardware. By tracking the supply chain of specialized AI chips, the study reveals an alarming acceleration in power requirements.

Understanding the Scale of AI’s Energy Appetite

The research focused on the production of AI accelerator modules manufactured by NVIDIA and AMD between 2023 and 2024. The cumulative thermal design power (TDP) of these units reached 3.8 gigawatts, a power draw roughly comparable to the average electricity demand of a country like Ireland.

This surge is largely driven by advancements in semiconductor packaging. TSMC’s CoWoS (Chip-on-Wafer-on-Substrate) technology, crucial for modern AI accelerators, allows processors and high-bandwidth memory to be integrated into a single package. This innovation helps overcome the "memory wall" problem but comes with significant energy costs.

Analysts estimate that TSMC’s CoWoS production capacity reached approximately 126,500 300-millimeter wafers in 2023. By 2024, that number more than doubled to 327,400 wafers, signaling massive industry expansion.
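The study's supply-chain approach can be sketched in a few lines: take the number of CoWoS wafers, estimate how many accelerator modules they yield, and multiply by a per-module TDP. The per-wafer module count, yield rate, and per-module wattage below are illustrative assumptions for the sketch, not figures from the study.

```python
# Back-of-the-envelope version of the wafer-based estimation method.
# modules_per_wafer, yield_rate, and tdp_per_module_w are assumed
# illustrative values, not numbers reported in the study.

def cumulative_tdp_gw(wafers, modules_per_wafer=30, yield_rate=0.9,
                      tdp_per_module_w=700):
    """Aggregate thermal design power (in GW) implied by a wafer count."""
    modules = wafers * modules_per_wafer * yield_rate
    return modules * tdp_per_module_w / 1e9  # watts -> gigawatts

# CoWoS wafer estimates cited in the article
for year, wafers in [(2023, 126_500), (2024, 327_400)]:
    print(f"{year}: ~{cumulative_tdp_gw(wafers):.1f} GW of aggregate TDP")
```

Under these placeholder parameters the 2023 wafer count implies a couple of gigawatts of aggregate TDP; the actual figure depends on the real die sizes, yields, and module ratings that the study derives from the supply chain.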

Projected Energy Consumption and Global Impact

De Vries-Gao’s model projects that AI systems from NVIDIA and AMD alone could require up to 5.3 gigawatts of power in 2025. When including other manufacturers utilizing TSMC’s CoWoS capacity, total power demand could reach 9.4 gigawatts.

This would translate to an annual consumption between 46 and 82 terawatt-hours—comparable to the total yearly electricity usage of countries like Switzerland, Austria, or Finland.
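The conversion behind that range is straightforward: annual energy equals power times the hours in a year times a utilization factor. A 9.4-gigawatt fleet running around the clock lands at the top of the range; the 46 TWh low end corresponds to a utilization of roughly 56%, though the exact utilization assumptions here are ours for illustration.

```python
# Converting a power figure (GW) into annual energy (TWh):
# energy = power x hours per year x utilization.
HOURS_PER_YEAR = 8760  # 365 days x 24 hours

def annual_twh(gigawatts, utilization):
    """Annual energy in TWh for a given power draw and utilization."""
    return gigawatts * HOURS_PER_YEAR * utilization / 1000  # GWh -> TWh

print(annual_twh(9.4, 1.0))   # continuous operation: about 82 TWh
print(annual_twh(9.4, 0.56))  # ~56% utilization: about 46 TWh
```

This is why a single gigawatt figure maps to a range of terawatt-hour estimates: the energy total hinges on how heavily the hardware is actually used over the year.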

The study draws a parallel between AI and cryptocurrency mining, noting that both industries operate under a "bigger is better" mindset. This philosophy encourages continuous scaling, which in turn drives energy demand upward despite improvements in hardware efficiency.

Innovation as a Potential Counterbalance

While the current trajectory seems concerning, the research also identifies potential mitigating factors. Innovations in both hardware and software could help curb energy growth.

For example, DeepSeek’s R1 model—developed in China—claims to achieve performance comparable to ChatGPT using lower-tier hardware and novel software techniques. Such approaches demonstrate that computational and energy efficiency can be improved without sacrificing capability.

Still, the lack of transparency from major tech companies remains a barrier to accurate assessment. Google, for instance, has discontinued its previous practice of disclosing AI-related energy usage—a move that complicates independent evaluation.

De Vries-Gao emphasizes the need for stricter regulatory frameworks that mandate public reporting of AI energy consumption. Greater transparency would enable policymakers and researchers to better understand and manage the environmental impact of AI technologies.

Frequently Asked Questions

Why is AI’s energy consumption growing so rapidly?
AI models, particularly large language models, require immense computational power for both training and inference. As models grow in size and complexity, their energy demands increase—even if hardware becomes more efficient.

How does AI energy use compare to other industries?
By the end of 2025, AI is expected to consume more electricity than Bitcoin mining. It may represent up to half of all data center energy usage, placing it among the most energy-intensive digital technologies.

Can AI become more energy-efficient in the future?
Yes. Improvements in algorithms, specialized hardware, and model optimization can reduce energy needs. Some companies are already developing leaner systems that deliver high performance with lower power consumption.

What role can policymakers play in managing AI’s energy impact?
Governments can introduce regulations requiring energy transparency, set efficiency standards, and incentivize the development of low-power AI systems. Public reporting is a critical first step.

Are renewable energy sources a solution for AI’s power demand?
While transitioning to renewables can reduce carbon emissions, the sheer scale of AI’s energy use remains a challenge. Efficiency improvements and responsible scaling are also necessary.

How can companies contribute to more sustainable AI practices?
Tech firms can prioritize energy-efficient model design, invest in green data centers, and openly report environmental metrics. Industry collaboration is key to setting sustainable standards.