
Nvidia is building "AI factories" and its next-generation Blackwell GPU is designed to scale AI.

Matthias Bastian
Illustration of an imaginative AI factory | Image: Midjourney prompted by THE DECODER

Nvidia is developing a new type of data center that CEO Jensen Huang calls an "AI factory." The new generation of GPUs, called Blackwell, is designed to enable AI scaling.

Unlike traditional data centers, where many users share a cluster of computers, an AI factory will be "much more like a power generator," Huang said. Nvidia has been working on this for the past few years and now wants to turn the technology into a product.

The new form of data center, which Huang describes as "quite unique," does not yet have an official name. Huang told Wired that it will be ubiquitous in the future and will be built by both cloud service providers and Nvidia itself. He did not, however, describe the capabilities of the "AI factory" in any detail.

"We looked at the way Moore’s law was formulated, and we said, 'Don’t be limited by that. Moore’s law is not a limiter to computing.'"

These AI factories will be used by cloud service providers, biotechnology companies, retailers, logistics companies, and car manufacturers. As an example, Huang cites a car factory that will have an AI factory in addition to factories that produce car parts. Huang points to Elon Musk as a pioneer in this area, as he is already focused on combining physical manufacturing with AI development.

Demand for Nvidia cards still outstrips production

Huang does not expect Nvidia to fully meet demand for its graphics cards this year, and he considers it unlikely next year as well.

According to Huang, Nvidia's next generation of GPUs, codenamed "Blackwell," will be released this year with "off-the-chart" performance.

The new cards are also designed to reduce the cost of training AI models, allowing AI companies to scale them.

"The goal is to reduce the cost of training models tremendously. Then people can scale up the models they want to train," Huang said.

Many in the AI industry see massive scaling, alongside architectural advances, as the driving force behind AI progress.

However, scaling remains a challenge, in part because of the high cost of training and running inference on large AI models and the limited availability of hardware. Reducing hardware and operating costs would therefore be a major step forward for AI scaling.
