According to TheRegister.com, Turner & Townsend’s 2025-2026 Datacenter Construction Cost Index reveals massive power constraints threatening AI infrastructure growth. The survey of 300+ projects across 20+ countries with input from 280 experts shows 48% of respondents cite power access as the biggest scheduling constraint, with US grid connection wait times stretching to seven years. OpenAI’s disclosed projects alone would consume 55.2 gigawatts—enough to power 44.2 million households, nearly triple California’s housing stock. Deloitte warned AI datacenter power needs in the US may be 30 times greater within a decade, while 83% of professionals believe local supply chains can’t support advanced cooling technology for AI deployments.
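Two of those headline numbers are easy to sanity-check with back-of-envelope arithmetic. The household equivalence depends on an assumed average consumption figure (the ~10.9 MWh per US household per year below is my assumption, not a number from the report), and Deloitte's "30 times within a decade" implies a compound annual growth rate worth making explicit. A quick sketch:

```python
# Back-of-envelope checks for the headline figures. The 10.9 MWh/year
# average US household consumption is an assumption, not from the report.

# 1 GW of continuous draw over a year, expressed in MWh
GW_YEAR_IN_MWH = 1_000 * 8_760

disclosed_gw = 55.2            # OpenAI's disclosed project capacity
household_mwh_per_year = 10.9  # assumed average annual US household use

households_millions = disclosed_gw * GW_YEAR_IN_MWH / household_mwh_per_year / 1e6
print(f"~{households_millions:.1f} million households")  # ~44.4 million

# Deloitte: demand may be 30x within a decade -> implied annual growth rate
implied_cagr = 30 ** (1 / 10) - 1
print(f"~{implied_cagr:.0%} per year")  # ~41% per year
```

The first result lands right on the article's 44.2 million figure, and the second shows why "30 times in ten years" is so aggressive: it requires demand to grow roughly 40% every single year, compounding, for a decade.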
The power crunch is real
Here’s the thing—we’ve been talking about AI’s energy appetite for a while, but these numbers are staggering. Seven-year wait times for grid connections? That’s basically telling companies “maybe you’ll get power by 2032.” And we’re not talking about small projects either—OpenAI’s planned infrastructure would use more electricity than entire states. The competition for power between datacenters, housing, and manufacturing is creating a zero-sum game where someone’s going to lose. And honestly, when governments have to choose between keeping lights on in homes versus powering AI models, which way do you think they’ll lean?
The cooling problem nobody saw coming
What’s really interesting is that 83% of professionals don’t think local supply chains can handle the cooling demands. Traditional air-cooled datacenters are one thing, but AI facilities require liquid cooling, which the report pegs at roughly 7-10% more expensive to build than comparable air-cooled capacity. We’re talking about specialized equipment that simply doesn’t exist at scale yet. So even if you solve the power problem, you might not be able to keep the chips from melting. It’s like building a Formula 1 car and then discovering nobody makes tires that can handle the speed.
Time for a reality check
Look, the AI hype train has been moving at light speed, but physics and infrastructure don’t care about venture capital timelines. Turner & Townsend’s report suggests on-site generation and energy storage as solutions, but let’s be real—most of that will likely mean gas-powered generators. So much for green AI. And chip supply constraints? That’s another bottleneck waiting to happen. Basically, we’re building the plane while flying it, and there might not be enough fuel to reach the destination.
What happens when the music stops?
The report’s authors aren’t mincing words—investment is at risk. Paul Barry from Turner & Townsend basically said what everyone’s thinking: we’re in uncharted territory. AI datacenters are more advanced, costlier, and hungrier than anything we’ve built before. And they’re competing with everyday energy needs during a time when grids are already stressed. So here’s the billion-dollar question: how many of these planned AI megaprojects will actually get built? My guess is far fewer than the hype suggests. The infrastructure just isn’t there to support them.
