
At Computex in Taiwan, Nvidia presented various AI products. These include Project G-Assist for gamers, Nvidia ACE for digital humans, and the RTX AI Toolkit for developers.


Nvidia announced a range of new products and updates at Computex. Alongside news about its expanding presence across the industry, the company also unveiled new projects for consumers.

With Project G-Assist, Nvidia showed a tech demo of a game-integrated AI assistant that provides context-based help in games like ARK: Survival Ascended. The assistant responds to voice or text input, analyzes visual information from the game, and provides tailored answers. It will support ARK players on topics like creatures, items, story, and bosses, and can also configure the system for optimal performance.

There was also an update on ACE, Nvidia's platform for digital humans. ACE comes to RTX PCs for the first time with NIM inference microservices. NIMs enable local execution of models for speech understanding, speech synthesis, and facial animation. Nvidia showed what this could look like with a tech demo in Covert Protocol.


Nvidia also introduced the RTX AI Toolkit. This is a collection of tools and SDKs that allow developers to optimize and deploy large generative AI models for Windows PCs. With it, models can be customized, compressed, and executed up to four times faster, according to Nvidia. A central element is the AI Inference Manager SDK for integrating AI into PC applications.

Other news includes a collaboration with Microsoft on GPU acceleration for small language models in the Windows Copilot Runtime, as well as new RTX-accelerated AI features in apps for creative work and video editing. The updates also include an optimized version of the popular ComfyUI tool for Stable Diffusion and an open-source SDK for the RTX Remix modding workflow.

The new features will be available starting in June for RTX PCs and the newly announced RTX laptops with the RTX 4000 series and up. Some tools are already available, while others will follow later in the year as a developer preview.

Spectrum-X: Nvidia's new Ethernet network platform

Nvidia has also announced the broad rollout of the Nvidia Spectrum-X Ethernet networking platform and an accelerated product release plan. According to Nvidia, Spectrum-X is the world's first Ethernet fabric for AI and will accelerate network performance for generative AI by 1.6 times compared to conventional Ethernet fabrics.

AI cloud providers like CoreWeave, Lambda, and Scaleway are among the first to use Spectrum-X. Nvidia's CEO Jensen Huang promised regular updates for Spectrum-X to further accelerate networking.


Nvidia also announced an expansion of its Nvidia-certified systems program to make it easier for customers to deploy AI workloads. New certifications include "Nvidia Spectrum-X Ready" for AI in the data center and "Nvidia IGX" for AI at the edge.

Nvidia's robotics platform finds new partners, NIM microservices free for developers

Huang also announced that leading companies in the robotics industry such as BYD Electronics, Siemens, Teradyne Robotics, and Intrinsic (Alphabet) are using the Nvidia Isaac platform for developing the next generation of AI-powered autonomous machines and robots.

The Isaac platform includes Nvidia-accelerated libraries, AI models, and simulation technologies, which robot manufacturers can integrate into their own technology stacks. Nvidia plans to continuously develop its own robotics stacks, including Omniverse for simulation applications, foundation models for humanoid robots, and the Jetson Thor robotics computer.

Huang also demonstrated robots for transportation, healthcare, and industry. Foxconn showed a fully simulated autonomous factory in Nvidia Omniverse with AI robots from Nvidia partners.


Nvidia also provided an update on the NIM inference microservice framework, which aims to simplify AI application development. NIM offers optimized containers for over 40 AI models, including Llama 3, Gemma, and Mistral, and reduces development time from weeks to minutes, according to the company.
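For developers who want to experiment with NIM locally, the containers expose an OpenAI-compatible HTTP API. The following Python sketch shows what a call against a locally running Llama 3 NIM might look like; the host, port, and model identifier are assumptions about a typical deployment, not details from Nvidia's announcement.

    # Minimal sketch: querying a locally running NIM container through its
    # OpenAI-compatible endpoint. Host, port, and model name are assumed.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
        api_key="not-used-locally",           # placeholder; a local container may not check it
    )

    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",      # assumed model identifier
        messages=[{"role": "user", "content": "Explain what a NIM microservice is in one sentence."}],
        max_tokens=128,
    )
    print(response.choices[0].message.content)

Because the endpoint mirrors the OpenAI API, existing client code can usually be repointed at a NIM container by changing only the base URL and model name.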

According to Nvidia, more than 150 partners are already integrating NIM to accelerate enterprise and industry-specific applications, including Amdocs, Foxconn, Lowe's, Pegatron, and Siemens. NIM microservices are available through Nvidia AI Enterprise and, effective immediately, are free for Nvidia Developer Program members for research and testing.

Summary
  • At Computex, Nvidia showcased several AI products for gamers, developers, and businesses, including Project G-Assist for in-game contextual assistance, ACE for digital humans on RTX PCs, and the RTX AI Toolkit for optimizing generative AI models.
  • The new Spectrum-X Ethernet networking platform is designed to accelerate network performance for generative AI by 1.6x and is already being used by AI cloud providers such as CoreWeave, Lambda, and Scaleway. Nvidia is also expanding its certification program for AI workloads.
  • Leading companies in the robotics industry rely on Nvidia's Isaac platform to develop AI-powered autonomous machines and robots. The NIM inference microservices framework is designed to simplify the development of AI applications and is available at no cost to members of the Nvidia Developer Program.