OpenAI is looking beyond Microsoft for its cloud computing needs.
CEO Sam Altman and CFO Sarah Friar told employees about this move after the company's recent $6.6 billion funding round, sources told The Information.
Friar reportedly told shareholders that Microsoft wasn't delivering computing capacity quickly enough. That prompted OpenAI to explore other data center options, which its contract with Microsoft allows it to do.
Altman is concerned that Microsoft won't be able to deliver servers fast enough for OpenAI to stay ahead of Elon Musk's xAI. Musk plans to release Grok 3, which he claims will be the most powerful AI model, by the end of the year, and xAI is building out massive server infrastructure in Memphis.
OpenAI deepens partnership with Oracle
OpenAI announced its first Oracle deal in June; Microsoft was only marginally involved, according to The Information's sources. Still, the deal contributes to Microsoft's Azure revenue, since OpenAI runs Azure infrastructure on Oracle's servers.
OpenAI is now negotiating with Oracle to lease an entire data center in Abilene, Texas, according to sources cited by The Information. By mid-2026, the Abilene facility could reach nearly a gigawatt of power, potentially housing hundreds of thousands of Nvidia AI chips. The data center has room to expand up to two gigawatts, if enough energy is available.
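For scale, a rough back-of-envelope calculation shows how a gigawatt-class facility maps to chip counts. The per-GPU power draw, host overhead, and PUE values in this sketch are illustrative assumptions, not figures from the report:

```python
# Back-of-envelope estimate (assumptions, not reported numbers): how many
# AI accelerators roughly fit in a ~1 GW data center.

FACILITY_POWER_W = 1_000_000_000   # ~1 gigawatt of total facility power
PUE = 1.3                          # assumed power usage effectiveness (cooling/overhead)
POWER_PER_GPU_W = 1_200            # assumed draw of one GB200-class GPU
HOST_OVERHEAD_W = 800              # assumed CPU/network/storage share per GPU

it_power = FACILITY_POWER_W / PUE                       # power left for IT load
gpus = it_power / (POWER_PER_GPU_W + HOST_OVERHEAD_W)   # accelerators supported
print(f"Roughly {gpus:,.0f} accelerators")              # on the order of a few hundred thousand
```

Under these assumptions the facility lands in the high hundreds of thousands of watts per thousand chips, consistent with the "hundreds of thousands of Nvidia AI chips" figure; the real count depends heavily on the actual server configuration and cooling design.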
Microsoft aims to give OpenAI access to about 300,000 of Nvidia's latest GB200 graphics processors in data centers in Wisconsin and Atlanta by the end of next year. Altman has asked Microsoft to speed up the Wisconsin project, which could open partially in the second half of 2025.
OpenAI plans to rely more on its own AI chips in the future to meet growing computing demands and reduce costs. The company is working with Broadcom and Marvell to design custom ASICs, and it has reportedly reserved capacity on TSMC's new A16 Angstrom process, with mass production set to start in the second half of 2026.