Nvidia posted record results in the third quarter of fiscal year 2025, with revenue rising to $35.1 billion. According to CEO Jensen Huang, the company is only at the beginning of two fundamental developments that should drive further growth in the long term.

"The tremendous growth in our business is being fueled by two fundamental trends that are driving global adoption of NVIDIA computing," Huang said during the earnings call. The first trend is the modernization of global IT infrastructure.

Huang sees a transformation of unprecedented scale: The world's trillion-dollar CPU-based IT infrastructure is being modernized to support machine learning and AI. "The computing stack is undergoing a reinvention, a platform shift from coding to machine learning, from executing code on CPUs to processing neural networks on GPUs."

Huang expects this transformation to take several years as companies worldwide retrofit their data centers. "The $1 trillion installed base of traditional data center infrastructure is being rebuilt for Software 2.0, which applies machine learning to produce AI."

AI factories for "digital intelligence"

The second fundamental trend, according to Huang, is the production of digital intelligence in "AI factories" that run around the clock. "The age of AI is in full steam. Generative AI is not just a new software capability but a new industry with AI factories manufacturing digital intelligence, a new industrial revolution that can create a multi-trillion-dollar AI industry."

Nvidia's Hopper and Blackwell architectures, along with platforms like Omniverse, play a key role in this development. Demand for Hopper chips is "exceptional," according to Nvidia, with H200 chip revenue more than doubling quarter-over-quarter. Blackwell is in mass production, with demand far exceeding supply.

Huang sees several reasons for the enormous demand: "There are more foundation model makers now than there were a year ago. The computing scale of pretraining and post-training continues to grow exponentially. There are more AI-native start-ups than ever, and the number of successful inference services is rising. And with the introduction of OpenAI's o1, a new scaling law called test-time scaling has emerged. All of these consume a great deal of computing."

New markets emerge alongside cloud providers

Beyond major cloud providers, a new market is emerging with "Sovereign AI": Countries and regions are building independent AI infrastructures to meet regional requirements. According to Nvidia, India plans to increase its number of Nvidia GPUs tenfold by year's end. Japan is building one of the most powerful supercomputers with SoftBank, based on Nvidia's DGX Blackwell. European countries are also working on regional clouds and AI factories, Huang said.

OpenAI's o1 model shows new scaling dimension

Nvidia also benefits from new optimization techniques such as post-training and test-time scaling, which further increase demand for computing power. Test-time scaling spends additional compute at inference time to deliver smarter answers in real time. OpenAI uses this technique for its new o1 model. "It's a little bit like us doing thinking in our head before we answer your question," Huang said. "And as a result of that, the demand for our infrastructure is really great."
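
To illustrate the idea in a few lines of code: one simple form of test-time scaling is best-of-n sampling, where a model generates several candidate answers and a scoring model keeps the best one, trading extra inference compute for answer quality. The sketch below is only a schematic illustration with hypothetical `generate` and `score` functions standing in for a language model and a verifier; it is not how o1 works internally, which OpenAI has not disclosed.

```python
import random
from typing import Callable, List

def best_of_n(
    prompt: str,
    generate: Callable[[str], str],
    score: Callable[[str, str], float],
    n: int = 8,
) -> str:
    """Naive test-time scaling: spend extra compute on a single query by
    sampling n candidate answers and keeping the highest-scoring one."""
    candidates: List[str] = [generate(prompt) for _ in range(n)]
    return max(candidates, key=lambda answer: score(prompt, answer))

# Toy stand-ins for a sampling model and a verifier/reward model.
def toy_generate(prompt: str) -> str:
    return f"candidate answer #{random.randint(0, 99)}"

def toy_score(prompt: str, answer: str) -> float:
    return random.random()

if __name__ == "__main__":
    # Raising n buys better answers (with a real model and verifier)
    # at the cost of proportionally more inference compute per query.
    print(best_of_n("What is 2 + 2?", toy_generate, toy_score, n=8))
```

Doubling n roughly doubles the inference compute spent on a single question, which is exactly the kind of additional demand for computing infrastructure Huang is pointing to.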

These trends create a new dimension of scalability, while the scalability of training foundation models remains intact, according to Huang: "As you know, this is an empirical law, not a fundamental physical law. But the evidence is that it continues to scale. What we're learning, however, is that it's not enough, that we've now discovered two other ways to scale." With this statement, Huang also addressed recent reports suggesting that training scaling is reaching its limits. That would hit Nvidia particularly hard, as the company is the undisputed market leader in AI training. In the AI inference segment, however, it faces significantly more competition, albeit from younger rivals.

Summary
  • Nvidia reports record third-quarter revenue of $35.1 billion. CEO Jensen Huang sees two fundamental trends that should allow his company to continue to grow: the modernization of global IT infrastructure and the creation of AI factories for digital intelligence.
  • The world's $1 trillion IT infrastructure is currently being transformed from CPU-based to GPU-based systems. Huang expects this transformation to take several years, as data centers around the world need to be upgraded to handle AI processing.
  • Demand for Nvidia's H200 chips has more than doubled quarter-over-quarter. New techniques such as test-time scaling, which OpenAI uses in its o1 model, are further driving demand for computing power. At the same time, countries like India and Japan are building their own AI infrastructures with Nvidia hardware.