The rise of artificial intelligence is reshaping everything from business strategy to national security — but behind the algorithms and data sets lies a mounting question: can the world’s energy grids cope with the exploding power demands of AI infrastructure? Experts warn that without urgent investment, the energy cost of the AI boom could overwhelm existing systems.
AI models, particularly large-scale systems like those used in language generation, image synthesis, or autonomous decision-making, require vast amounts of computational power. This, in turn, demands energy-intensive data centres equipped with high-performance GPUs, cooling systems and uninterrupted power supplies. As the scale of AI training and deployment accelerates, power usage is climbing rapidly, often outpacing projections.
According to the International Energy Agency (IEA), electricity demand from global data centres could double between 2022 and 2026, driven primarily by AI and crypto-related workloads. In the US alone, data centres already account for around 2.5% of national electricity consumption, and that share is expected to rise significantly if AI adoption continues at its current pace.
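To put that projection in perspective, a doubling over four years implies annual growth of roughly 19%. The short sketch below is a back-of-envelope illustration of that arithmetic, not a calculation from the IEA report; the 2022 baseline figure is an assumed round number used purely for scale.

```python
# Back-of-envelope: what annual growth rate does "doubling between 2022 and 2026" imply?
# The baseline of ~460 TWh for global data centre electricity use in 2022 is an
# assumed illustrative figure, not a number taken from the article.

baseline_twh = 460          # assumed 2022 global data centre demand (TWh)
years = 2026 - 2022         # projection horizon
growth_factor = 2.0         # "could double"

# Compound annual growth rate implied by a doubling over the horizon
cagr = growth_factor ** (1 / years) - 1
projected_twh = baseline_twh * growth_factor

print(f"Implied annual growth rate: {cagr:.1%}")          # ~18.9% per year
print(f"Projected 2026 demand: {projected_twh:.0f} TWh")  # ~920 TWh
```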
In key data centre hubs such as Ireland and Northern Virginia, grid operators are sounding the alarm. Ireland's grid operator, EirGrid, has warned of potential supply crunches, with energy constraints already delaying new data centre connections. Meanwhile, in parts of the UK and continental Europe, regulators are urging a rethink of planning laws and energy priorities to accommodate the surge in demand.
“There’s a real risk that AI’s growth becomes bottlenecked by infrastructure,” says Dr Linda Kerr, an energy systems analyst based in London. “Not just in terms of electricity supply, but also in cooling, land use and sustainability metrics.”
Tech giants, including Microsoft, Google and Amazon, are scrambling to lock in future energy capacity through long-term power purchase agreements (PPAs), particularly with renewable providers. Yet this creates its own tensions: critics argue that AI’s massive footprint risks crowding out other industries and communities in need of clean energy access.
The pressure is also forcing changes in how data centres are designed and operated. Liquid cooling, once niche, is becoming mainstream. Modular data centres that can be located closer to renewable energy sources are being deployed. Companies are experimenting with AI itself to improve energy efficiency, though this, paradoxically, adds another computational layer.
Governments are beginning to take notice. In the US, the Department of Energy has launched a task force on digital energy demand. China has moved to regulate data centre development more strictly, while the EU is considering emissions thresholds and reporting requirements for AI infrastructure under its Green Deal framework.
Some in the sector argue the fears are overblown, pointing to advances in chip efficiency and the growing share of renewables in the power mix. But many experts caution that without systemic change — including massive upgrades to transmission networks, storage capacity and demand-response systems — the AI revolution could become a strain rather than a solution.
As one analyst put it: “The intelligence may be artificial, but the energy needs are very real.”