AI’s power problem is way bigger than we thought


According to TheRegister.com, Turner & Townsend’s 2025-2026 Datacenter Construction Cost Index reveals that power access is now the biggest scheduling constraint for 48% of datacenter professionals, with grid connection wait times stretching up to seven years in the US and requiring substation upgrades worth hundreds of millions in Britain. The report surveyed 300+ projects across 20+ countries with input from 280 industry experts, finding that OpenAI’s disclosed projects alone would consume 55.2 gigawatts—enough to power 44.2 million households, nearly triple California’s housing stock. Meanwhile, 83% of professionals believe local supply chains can’t support advanced cooling technology for AI deployments, and AI-optimized liquid-cooled facilities cost 7-10% more than air-cooled designs. Deloitte warned in June that AI datacenter power needs in the US may be 30 times greater within a decade, with 5 GW facilities already in planning.


Power grid reality check

Here’s the thing that nobody in the AI hype cycle wants to admit: we’re trying to plug industrial-scale power demands into grids that were built for a different era. Seven-year wait times for grid connections? That’s basically telling AI companies “maybe try again in 2032.” And we’re not talking about small power draws here—OpenAI’s planned projects alone would consume enough electricity for 44 million homes. That’s insane when you think about it.
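That "44 million homes" figure checks out as rough arithmetic. As a sanity check, here's a minimal sketch assuming an average US household draws about 1.25 kW of continuous power (roughly 11,000 kWh per year; the exact per-household figure is an assumption, not something the report states):

```python
# Back-of-envelope check of the report's "44.2 million households" claim.
# Assumption (not from the article): an average US household draws
# roughly 1.25 kW of continuous power (~11,000 kWh per year).

OPENAI_PROJECTS_GW = 55.2   # disclosed projects, per the report
AVG_HOUSEHOLD_KW = 1.25     # assumed continuous draw per household

watts = OPENAI_PROJECTS_GW * 1e9
households = watts / (AVG_HOUSEHOLD_KW * 1e3)

print(f"{households / 1e6:.1f} million households")  # → 44.2 million households
```

Divide industrial gigawatts by a single home's average draw and you land right on the report's number, which is exactly why the scale is hard to grasp: one company's disclosed pipeline maps onto tens of millions of homes.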

But the real problem is that datacenters are competing with everything else that needs power—housing, manufacturing, electric vehicles. There’s only so much juice to go around, and suddenly everyone wants a bigger sip from the same cup. Turner & Townsend’s full report recommends on-site generation and energy storage, but let’s be real: when the report talks about “renewables” for on-site generation, we all know what’s actually going to happen: gas-powered generators. So much for green AI.

Supply chain crunch

And then there’s the cooling problem. Some 83% of professionals don’t think local supply chains can handle the advanced cooling tech AI deployments need. Traditional air-cooled facilities are seeing cost increases too, but the real jump comes with AI-optimized liquid-cooled designs, which run 7-10% more than air-cooled builds. We’re building a whole new category of infrastructure here, and the supply chains just aren’t ready.

Paul Barry from Turner & Townsend nailed it when he said AI datacenters are “more advanced, and by extension, costlier.” But that’s putting it mildly. We’re talking about facilities that need specialized cooling, massive power draws, and custom hardware—all while competing for limited construction resources and facing years-long delays. Something’s gotta give.

Chip reality

Oh, and let’s not forget the hardware side. The report mentions chip makers might struggle to produce enough supply to support this massive buildout. So even if you somehow solve the power problem and the cooling problem and the construction problem… you might not have enough chips to actually run the AI models. It’s like building the world’s most advanced kitchen only to discover there’s a global spatula shortage.

Basically, we’re witnessing the collision between AI’s exponential growth curves and the physical world’s very linear constraints. Power grids don’t scale like software. Supply chains don’t move at startup speed. And seven-year wait times don’t fit well with quarterly earnings reports. The AI boom is about to meet its first real physical limit, and it’s going to be messy.
