
Google likely has the world's largest AI computing capacity, according to an analysis by AI research firm Epoch AI.


The tech giant's edge comes from its custom-built Tensor Processing Units (TPUs), which provide computing power equivalent to at least 600,000 Nvidia H100 GPUs. "Given this large TPU fleet, combined with their NVIDIA GPUs, Google probably has the most AI compute capacity of any single company," Epoch AI researchers noted.
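For context, an "H100-equivalent" figure like the one above is simply a fleet's aggregate peak throughput divided by one H100's peak. Below is a minimal back-of-envelope sketch in Python; the per-chip throughput values and the fleet size are illustrative assumptions for this example, not Epoch AI's actual inputs or methodology.

    # Back-of-envelope sketch: expressing an accelerator fleet as "H100-equivalents".
    # All numbers below are illustrative assumptions, not Epoch AI's figures.

    H100_BF16_TFLOPS = 989.0   # assumed H100 dense BF16 peak, in TFLOP/s
    TPU_BF16_TFLOPS = 459.0    # assumed TPU v5p-class dense BF16 peak, in TFLOP/s

    def h100_equivalents(num_chips: int, tflops_per_chip: float) -> float:
        """Convert a fleet's aggregate peak throughput into a count of H100-equivalents."""
        return num_chips * tflops_per_chip / H100_BF16_TFLOPS

    # A hypothetical fleet of 1.3 million TPU-class chips works out to roughly
    # 600,000 H100-equivalents under these assumptions.
    print(f"{h100_equivalents(1_300_000, TPU_BF16_TFLOPS):,.0f} H100-equivalents")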

Nvidia dominates the AI chip market

Despite Google's lead in total capacity, Nvidia remains the dominant player in AI chip sales. Since early 2022, Nvidia has sold AI chips with computing power equal to about 3 million H100 GPUs. Most went to four major cloud providers, with Microsoft being the largest customer.

Other buyers include cloud companies like Oracle and CoreWeave, AI firms such as xAI and Tesla, Chinese tech companies, and governments building national AI infrastructure.

Chart: Estimated AI computing capacity of leading technology companies, based on Nvidia revenue and TPU shipments. Google leads thanks to its own TPUs, a distribution that could shape the future development and deployment of AI models. | Image: via Epoch AI

Epoch AI cautions that its estimates have significant uncertainty and that the AI chip landscape is rapidly evolving as companies develop new processors and ramp up production.

Nvidia's Blackwell generation reportedly already sold out

Nvidia's upcoming Blackwell GPUs are reportedly in high demand. According to Barron's, citing conversations between Morgan Stanley analysts and Nvidia CEO Jensen Huang, the next 12 months of Blackwell supply is already sold out.

Beyond Nvidia and Google, other players like AMD, Intel, Huawei, Amazon, Meta, OpenAI, and Microsoft are developing their own AI chips. China is reportedly encouraging domestic firms to buy more chips from Chinese suppliers like Huawei.

Summary
  • Google likely has the world's largest AI computing capacity thanks to its proprietary Tensor Processing Units (TPUs), which have the equivalent performance of at least 600,000 Nvidia H100 GPUs, according to an analysis by Epoch AI.
  • Despite Google's compute advantage, Nvidia remains the dominant player in the AI chip market, having sold AI chips with a combined compute power of about three million H100 GPUs since the beginning of 2022.
  • Nvidia's main customers are four major hyperscalers, with Microsoft the largest; other buyers include cloud providers such as Oracle and CoreWeave, AI companies, Chinese tech groups, and governments.