AI Could Outpace Bitcoin in Power Consumption by End of 2025

The global AI industry is fast approaching an energy milestone that could reshape both power infrastructure and climate policy. New research shows that by the end of 2025, the electricity consumed by advanced AI systems may exceed the notorious energy demands of Bitcoin mining.

As generative AI continues to explode in popularity, it’s driving an unprecedented boom in data center construction and chip production. Specialized AI accelerators from companies like Nvidia and AMD have become central to this growth, along with their considerable appetite for power. AI workloads, which currently account for about 20% of total data center energy use, are projected to consume nearly 50% by the end of 2025.

This rapid expansion has been closely analyzed by Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam. His findings, published in Joule, estimate AI’s energy demand based on public hardware specs, corporate disclosures, and industry forecasts. Since most companies don’t share exact energy usage data, de Vries-Gao pieced together the puzzle using supply chain insights and chip production data—especially from TSMC, the world’s leading chip manufacturer.

The energy demands are staggering. A single Nvidia H100 chip can draw 700 watts continuously. Multiply that by millions of units, and the impact is immense. De Vries-Gao estimates that the AI hardware produced in 2023 and 2024 alone could eventually draw between 5.3 and 9.4 gigawatts, more than Ireland’s entire national electricity consumption.
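
To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Only the 700-watt figure is a public chip spec; the utilization and overhead (PUE) factors are illustrative assumptions, not numbers from the paper, chosen simply to show how fleet sizes in the millions land in the gigawatt range.

```python
# Back-of-the-envelope fleet power estimate. Only the 700 W TDP is a
# public spec; utilization and PUE below are illustrative assumptions.

H100_TDP_WATTS = 700   # Nvidia H100 SXM rated board power
UTILIZATION = 0.8      # assumed average utilization (hypothetical)
PUE = 1.2              # assumed overhead for cooling and power delivery

def fleet_power_gw(chips: int) -> float:
    """Aggregate continuous draw, in gigawatts, for a fleet of accelerators."""
    return chips * H100_TDP_WATTS * UTILIZATION * PUE / 1e9

for chips in (2_000_000, 8_000_000, 14_000_000):
    print(f"{chips:>10,} H100-class chips -> {fleet_power_gw(chips):.1f} GW")

# Under these assumptions, ~8M chips give ~5.4 GW and ~14M give ~9.4 GW,
# the same order of magnitude as the 5.3-9.4 GW range cited above.
```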

And the surge is only beginning.

TSMC’s CoWoS packaging technology, which allows AI chips to pack in faster memory and more processing power, is at the heart of this revolution. TSMC more than doubled its CoWoS capacity in 2023–2024, yet still couldn’t keep up with demand. The company plans to double capacity again in 2025. If this trajectory holds, AI energy usage could reach 23 gigawatts by year’s end—about the same as the UK’s entire electricity consumption.
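
For readers more used to annual energy figures than power draw, a one-line conversion turns a constant 23 gigawatts into terawatt-hours per year, assuming round-the-clock operation, which is broadly how AI accelerators are run:

```python
# Convert a constant power draw into annual energy (assumes 24/7 operation).
ai_power_gw = 23.0
hours_per_year = 24 * 365
annual_twh = ai_power_gw * hours_per_year / 1000  # GW x hours -> GWh -> TWh
print(f"{ai_power_gw:.0f} GW sustained ≈ {annual_twh:.0f} TWh per year")  # ~201 TWh
```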

This would place AI ahead of global Bitcoin mining in terms of power draw. According to the International Energy Agency, AI-driven data centers could double their electricity use within just two years, even with ongoing improvements in efficiency.
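
As a quick sanity check on what doubling within two years implies, the compound annual growth rate works out to roughly 41 percent:

```python
# A doubling over two years corresponds to ~41% compound annual growth.
doubling_years = 2
annual_growth = 2 ** (1 / doubling_years) - 1
print(f"Implied annual growth rate: {annual_growth:.0%}")  # 41%
```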

While strides have been made in using greener energy and optimizing infrastructure, these efforts are being eclipsed by the sheer scale of AI’s growth. The industry’s push for bigger and more complex models fuels a cycle of ever-growing resource needs. Even as individual data centers become more efficient, overall consumption continues to rise.

Complicating matters further is a manufacturing bottleneck. New AI chips require increasingly advanced packaging, such as TSMC’s CoWoS-L, which currently suffers from low production yields.

At the same time, major tech firms like Google are warning of power shortages as they race to build more data centers. Some have even turned to repurposed fossil fuel plants, with one project locking in 4.5 gigawatts of natural gas to power AI operations.

The climate impact of this growth depends heavily on location. AI facilities in regions reliant on fossil fuels produce far more carbon emissions than those using renewable energy. For instance, a data center in coal-heavy West Virginia can emit nearly twice as much CO₂ as one in solar- and wind-powered California.
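
Because emissions scale linearly with the local grid’s carbon intensity, the comparison is easy to sketch. The intensity values below are illustrative placeholders standing in for a coal-heavy and a renewables-heavy grid, not measured figures for West Virginia or California:

```python
# Emissions = energy consumed x grid carbon intensity. The intensity
# values here are illustrative assumptions, not official grid data.

HOURS_PER_YEAR = 8760

def annual_emissions_tonnes(power_mw: float, gco2_per_kwh: float) -> float:
    """Annual CO2 in tonnes for a facility drawing power_mw continuously."""
    kwh_per_year = power_mw * 1_000 * HOURS_PER_YEAR  # MW -> kW, times hours
    return kwh_per_year * gco2_per_kwh / 1e6          # grams -> tonnes

facility_mw = 100  # hypothetical 100 MW data center
for grid, intensity in [("coal-heavy grid", 700.0), ("renewables-heavy grid", 350.0)]:
    print(f"{grid}: ~{annual_emissions_tonnes(facility_mw, intensity):,.0f} t CO2/yr")

# With a 2:1 intensity ratio, the coal-heavy site emits twice as much CO2
# for the same workload, mirroring the comparison in the paragraph above.
```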

As AI continues to expand, the question isn’t just how fast it will grow—but how sustainably it can evolve.
