
OpenAI reportedly turns to Oracle, as Microsoft can't meet its surging AI compute needs

Image: Midjourney prompted by THE DECODER

Key Points

  • OpenAI is looking beyond Microsoft for cloud computing power. CEO Sam Altman worries Microsoft can't provide servers fast enough to stay ahead of Elon Musk's xAI.
  • Talks are underway with Oracle to lease an entire data center in Abilene, Texas. This facility could reach nearly 1 gigawatt of power by mid-2026, potentially housing hundreds of thousands of Nvidia AI chips.
  • OpenAI plans to develop its own AI chips to meet growing computing demands and reduce costs. The company is working with Broadcom and Marvell on chip design, and has reportedly reserved capacity with TSMC for chip production.

OpenAI is looking beyond Microsoft for its cloud computing needs.

CEO Sam Altman and CFO Sarah Friar told employees about this move after the company's recent $6.6 billion funding round, sources told The Information.

Friar reportedly told shareholders that Microsoft wasn't supplying computing power quickly enough. That prompted OpenAI to explore other data center options, which its contract with Microsoft allows.

Altman is concerned that Microsoft won't be able to deliver servers fast enough for OpenAI to stay ahead of Elon Musk's xAI. Musk plans to release Grok 3, which he claims will be the most powerful AI model, by the end of the year. Musk's AI company xAI is building a massive server infrastructure in Memphis.


OpenAI deepens partnership with Oracle

OpenAI announced its first Oracle deal in June, one in which Microsoft was only marginally involved, according to The Information's sources. Still, the deal contributes to Microsoft's Azure revenue, since OpenAI runs Azure infrastructure on Oracle's servers.

OpenAI is now negotiating with Oracle to lease an entire data center in Abilene, Texas, according to sources cited by The Information. By mid-2026, the Abilene facility could reach nearly a gigawatt of power, potentially housing hundreds of thousands of Nvidia AI chips. The data center has room to expand up to two gigawatts, if enough energy is available.

Microsoft aims to give OpenAI access to about 300,000 of Nvidia's latest GB200 graphics processors in data centers in Wisconsin and Atlanta by the end of next year. Altman has asked Microsoft to speed up the Wisconsin project, which could open partially in the second half of 2025.

OpenAI plans to use more of its own AI chips in the future to meet growing computing demands and reduce costs. The company is working with Broadcom and Marvell to design ASIC chips. OpenAI has reportedly reserved capacity for TSMC's new A16 Angstrom process, with mass production set to start in the second half of 2026.



Source: The Information