AI’s Power Problem Is Reshaping The Energy Map

AI is not just an innovation story – it’s also an infrastructure one. The rapid scale-up of large language models is creating a new class of non-negotiable, 24/7 energy demand, raising fresh questions not just about how much energy AI consumes, but about how reliably and cleanly that power can be delivered.
The exact energy footprint of AI remains murky – as the Financial Times has noted – but the early signals are striking. According to data from Kanoppi, a traditional Google Search uses 0.0003 kWh per query, emitting 0.2g CO₂, while the same search using ChatGPT consumes 10 times more energy and emits roughly 68g CO₂. Scaled across billions of interactions, that energy-to-emissions ratio becomes significant.
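To make the scale concrete, the per-query figures cited above can be multiplied out. The sketch below is purely illustrative: the per-query values come from the Kanoppi data quoted here, while the query volume (1 billion per day) is a hypothetical assumption, not a reported figure.

```python
# Illustrative scaling of the per-query figures cited above (Kanoppi data).
GOOGLE_KWH_PER_QUERY = 0.0003                       # traditional Google Search
CHATGPT_KWH_PER_QUERY = GOOGLE_KWH_PER_QUERY * 10   # "10 times more energy"
CHATGPT_CO2_G_PER_QUERY = 68                        # grams CO2 per ChatGPT-assisted search

queries_per_day = 1_000_000_000                     # hypothetical volume for illustration

daily_energy_mwh = CHATGPT_KWH_PER_QUERY * queries_per_day / 1_000   # kWh -> MWh
daily_co2_tonnes = CHATGPT_CO2_G_PER_QUERY * queries_per_day / 1e6   # g -> tonnes

print(f"Energy: {daily_energy_mwh:,.0f} MWh/day")  # 3,000 MWh/day
print(f"CO2:    {daily_co2_tonnes:,.0f} t/day")    # 68,000 t/day
```

Even at this hypothetical volume, the arithmetic illustrates why per-query differences that look trivial in isolation become grid-scale loads in aggregate.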
This challenge is set to grow fast. The International Energy Agency (IEA) now predicts that electricity demand from data centres worldwide will more than double by 2030 to around 945 terawatt-hours (TWh) – slightly more than the entire electricity consumption of Japan today – with AI as the primary driver. AI is also distinct from other disruptors, such as electric vehicles, as it does not decarbonize existing fossil fuel usage, but instead creates new consumption.
This has profound implications for the grid. Many regions are already grappling with transmission bottlenecks, slow interconnection queues and aging infrastructure. Complicating matters, AI workloads are uniquely power-sensitive, requiring near-perfect uptime and fast response, further straining local networks. To meet these needs, data centres are often situated in rural areas where clean power and cheap land are available – but where grid capacity is often weak. This has sparked local backlash. For example, in the Netherlands, Microsoft’s data centres have been subject to farmer protests over water usage, while in Ireland, AWS has restricted the number of resources users can access due to grid capacity concerns.
Tech giants are trying to keep up by scaling their renewable portfolios. For example, Google recently signed a 600MW solar deal and Microsoft has announced plans for nuclear-powered data centres. But off-site clean energy procurement alone won’t be enough, with grid congestion and interconnection delays slowing the connection of new capacity in many markets. AI-driven loads will increasingly require resilient local infrastructure, with operators turning to microgrids, on-site generation and distributed energy resources (DERs) to manage reliability and carbon integrity.
This opens the door to new solutions. As AI-driven energy demand grows, vendors offering energy management systems, load forecasting and real-time dispatch will be critical. Some providers are also exploring decentralized compute models – such as edge AI and co-located micro data centres – to reduce grid pressure and better match distributed renewables.
AI isn’t a climate saviour, but it is now a permanent fixture in energy system design. The challenge is ensuring this new load aligns with net zero goals – and doesn’t compete with them.
For further insights into the intersection of AI and sustainability, see Verdantix Strategic Focus: Implications Of AI For The CSO.