AI in practice

Nvidia sees $600 billion AI market in the long term - and itself at the forefront

Matthias Bastian


Nvidia's big bet on AI seems to be paying off: executive Manuvir Das sees the company at the forefront of hardware and software for AI training and inference.

Manuvir Das, head of enterprise computing at Nvidia, estimates that the AI market will be worth $600 billion in the long term. Generative AI will account for $150 billion of that, and Omniverse enterprise software for another $150 billion.

He expects the largest share to go to those selling shovels to the AI gold diggers, i.e., chip manufacturers such as Nvidia: of the $600 billion, $300 billion is expected to come from chips for AI training and inference alone.

This is a "long-term market opportunity," according to Das, who describes Nvidia as an 80 percent software company whose software gets the most out of its hardware for AI training and inference.

That would put Nvidia in all three of the connected markets mentioned above: computing power, generative AI, and Omniverse enterprise software that makes both usable in everyday work processes.

"Accelerated computing": Same footprint, far more output

Nvidia sees the origin of the AI trend in "accelerated computing". Computing power is becoming increasingly important as more of the business world goes digital.

Instead of adding ever more traditional data centers and computing power, which Das says is simply not sustainable, companies need to make better use of existing capacity.

"And what we saw was accelerated computing, which is this way where the same amount of footprint can do 10x the work, 100x the work, that was going to be the only way," Das says. Generative AI, he adds, is the first killer app for this new computing paradigm.

Because AI models are only as good as the data they are trained on, Das envisions large companies running their own "AI factories" where new models are constantly trained on current data.

He also sees Nvidia as well positioned for running AI systems, known as inference. Most computers, servers, and workstations will be used for inference in the future, Das says, emphasizing that Nvidia offers the best combination of hardware and software for the job.