According to Manufacturing.net, OpenAI and Amazon have signed a massive $38 billion deal that will let the ChatGPT maker run its artificial intelligence systems on Amazon’s U.S. data centers. The agreement gives OpenAI access to “hundreds of thousands” of Nvidia’s specialized AI chips through Amazon Web Services, with all capacity expected to be deployed before the end of 2026 and potential expansion into 2027. Amazon’s shares jumped 4% immediately after Monday’s announcement. This comes less than a week after OpenAI altered its partnership with longtime backer Microsoft, which had been its exclusive cloud provider until early this year. California and Delaware regulators also recently approved OpenAI’s plan to form a new business structure to more easily raise capital and pursue profits.
<h2 id="cloud-wars">The Cloud Provider Shuffle</h2>
Here’s what’s really interesting about this timing. OpenAI just restructured its Microsoft partnership, and now they’re diving headfirst into a $38 billion deal with Amazon? That’s not just diversification—that’s a strategic power play. Microsoft must be watching this very carefully, even if they’re putting on a brave face. Sam Altman appearing on a podcast with Satya Nadella last week suddenly looks like some serious relationship management theater.
And let’s talk about that $38 billion number. That’s not pocket change, even for companies swimming in AI hype. Between this deal, its agreements with Oracle and SoftBank, and chip supply arrangements with Nvidia, AMD, and Broadcom, OpenAI has reportedly taken on more than $1 trillion in financial commitments for AI infrastructure. That’s an enormous amount of forward spending for a company that, let’s be real, isn’t exactly printing money yet.
The Burning Cash Question
So how does this math work? Altman says revenue is “growing steeply” and that OpenAI is taking a “forward bet” that it will continue. But investors are getting nervous about what they’re calling “circular” deals, where cloud providers and chipmakers front the infrastructure expecting future returns from a company that can’t currently afford to pay for it. It’s basically the entire AI industry betting on itself in one massive circle.
Meanwhile, Amazon is playing both sides of the AI war. They’re already the primary cloud provider to Anthropic, OpenAI’s biggest competitor with the Claude chatbot. Now they’re also powering OpenAI? Amazon wins either way, collecting rent while the AI companies duke it out. Smart business, honestly.
The Infrastructure Arms Race
The sheer scale of computing power needed here is staggering. “Hundreds of thousands” of Nvidia chips? That’s enough to make even the most hardened data center operator sweat. AI is turning into an energy and computing monster, and companies are scrambling to feed the beast.
What happens if the revenue growth Altman’s betting on doesn’t materialize fast enough? Or if the next generation of AI models requires even more computing power than anyone anticipated? We’re looking at some serious financial exposure here. But then again, maybe that’s why OpenAI needed those regulatory approvals for its new profit-seeking structure last week. The company is clearly preparing for something big, and it’s going to need to pay for all this infrastructure somehow.
