
OpenAI renting Google TPUs sends a strong warning shot to Microsoft

Image: Sora, prompted by THE DECODER

OpenAI has started using Google's Tensor Processing Units (TPUs) to run ChatGPT and other AI products, according to The Information. This marks the first time OpenAI is relying on chips beyond Nvidia's graphics processors at scale.

The TPUs are rented through Google Cloud and are aimed at lowering the costs of inference - the process of running trained models to generate answers for new prompts. Until now, OpenAI has been one of Nvidia's biggest customers, using its GPUs for both training and deploying large language models.

The partnership with Google has limits. According to The Information, Google is not giving OpenAI access to its most powerful TPU models. A Google Cloud employee confirmed this restriction.

A message to Microsoft

The move sends a clear signal to Microsoft, OpenAI's largest investor and the company that has been providing much of the infrastructure for OpenAI's products. By shifting some workloads onto Google's infrastructure, OpenAI is using its relationship with a key Microsoft competitor as strategic leverage.


OpenAI CEO Sam Altman and Microsoft CEO Satya Nadella are reportedly in ongoing talks about the companies' partnership. OpenAI has also expanded its compute capacity through a deal with Oracle.

Google Cloud competes directly with Microsoft Azure and is a major growth driver for Google. OpenAI's decision to use Google's infrastructure impacts not just the AI market, but also the cloud sector, which has been central to Microsoft's stock performance in recent years.

Google originally kept its TPUs for internal use, but is now opening them up to more outside partners. Besides OpenAI, customers include Apple and the startups Anthropic and Safe Superintelligence, both founded by former OpenAI executives.

According to research firm Epoch AI, Google's infrastructure gives it the world's largest AI computing capacity. The TPU cloud is a core part of Google's AI strategy and competes head to head with Nvidia's GPUs, especially for running large models.
