Nvidia used the NeurIPS conference to debut new AI models for autonomous driving and speech processing. The company introduced Alpamayo-R1, a system designed to handle traffic situations through step-by-step logical reasoning. Nvidia says this approach helps the model respond more effectively to complex real-world scenarios than previous systems. The code is public, but the license limits it to non-commercial use.

Nvidia also showed new tools for robotics simulation. In speech AI, the company unveiled MultiTalker, a model that can separate and transcribe overlapping conversations from multiple speakers.

Google is in talks with Meta and several other companies about letting them run Google's TPU chips inside their own data centers, according to a report from The Information. One person familiar with the discussions said Meta is considering spending billions of dollars on Google TPUs that would start running in Meta facilities in 2027. Until now, Google has only offered its TPUs through Google Cloud.

The new TPU@Premises program is Google's attempt to make its chips a more appealing alternative to Nvidia's AI hardware. According to someone with knowledge of internal comments, Google Cloud executives have said the effort could help the company reach ten percent of Nvidia's annual revenue. Google has also built new software designed to make TPUs easier to use.

Arm and Nvidia plan closer collaboration. Arm says its CPUs will be able to connect directly to Nvidia's AI chips using NVLink Fusion, making it easier for customers to pair Neoverse CPUs with Nvidia GPUs. The move also opens Nvidia's NVLink platform to processors beyond its own lineup.

The partnership targets cloud providers like Amazon, Google, and Microsoft, which increasingly rely on custom Arm chips to cut costs and tailor their systems. Arm licenses chip designs rather than selling its own processors, and the new protocol speeds up data transfers between CPUs and GPUs. Nvidia previously tried to buy Arm in 2020 for 40 billion dollars, but regulators in the United States and the United Kingdom blocked the deal.

According to the Wall Street Journal, Amazon, Microsoft, and AI startup Anthropic are backing a US law that would further restrict Nvidia's chip exports to China. The proposed GAIN AI Act would require semiconductor companies to satisfy US demand before shipping chips to countries under arms embargoes. The law would give tech giants like Amazon and Microsoft priority access to chips.

Nvidia opposes the plan, warning it would create unnecessary market interference. Some government officials question whether the law is even needed, pointing out that the Commerce Department already has the authority to enforce export controls. Meta and Google haven't commented on the proposal. The GAIN AI Act could be attached to the defense budget as an amendment.

Nvidia turns to synthetic data to tackle robotics’ biggest challenge: the lack of training data.

"We call this the big data gap in robotics," a Nvidia researcher said at the Physical AI and Robotics Day during GTC Washington. While large language models train on trillions of internet tokens, robot models like Nvidia’s GR00T have access to only a few million hours of teleoperation data, gathered through complex manual effort - and most of it is narrowly task-specific.

Nvidia's answer is to rethink what it calls the "data pyramid for robotics." At the top sits real-world data, which is scarce and expensive to collect. In the middle lies synthetic data from simulation, which is theoretically limitless. At the base sits unstructured web data. "When synthetic data surpasses the web-scale data, that's when robots can truly learn to become generalized for every task," the team said. With Cosmos and Isaac Sim, Nvidia aims to turn robotics' data shortage into a compute challenge instead.
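
To make the pyramid idea concrete, here is a minimal, hypothetical Python sketch of how such a mixed training stream could be sampled. The tier names, pool sizes, and weights below are illustrative assumptions, not Nvidia's actual GR00T, Cosmos, or Isaac Sim tooling; the point is simply that the plentiful synthetic tier carries most of the sampling weight while scarce real teleoperation data remains a small slice.

```python
import random

# Illustrative "data pyramid" mixture: a small pool of real teleoperation data,
# a large pool of synthetic simulation rollouts, and web-scale data, each
# sampled with a different weight. All names and numbers are assumptions made
# for this sketch, not Nvidia's actual training pipeline.
DATA_TIERS = {
    "web":         {"examples": [f"web_clip_{i}" for i in range(1_000)],    "weight": 0.2},
    "synthetic":   {"examples": [f"sim_rollout_{i}" for i in range(10_000)], "weight": 0.7},
    "real_teleop": {"examples": [f"teleop_demo_{i}" for i in range(50)],     "weight": 0.1},
}


def sample_batch(batch_size: int = 8) -> list[str]:
    """Draw one training batch, picking a tier per example according to its weight."""
    tiers = list(DATA_TIERS)
    weights = [DATA_TIERS[t]["weight"] for t in tiers]
    batch = []
    for _ in range(batch_size):
        tier = random.choices(tiers, weights=weights, k=1)[0]
        batch.append(random.choice(DATA_TIERS[tier]["examples"]))
    return batch


if __name__ == "__main__":
    print(sample_batch())
```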

Deutsche Telekom and Nvidia are joining forces to build the "Industrial AI Cloud" in Munich, set to become one of the largest AI computing hubs in Europe. The center will feature more than 1,000 Nvidia DGX B200 systems and RTX PRO servers, with up to 10,000 Nvidia Blackwell GPUs in total (each DGX B200 contains eight Blackwell GPUs, with the RTX PRO servers accounting for the rest). According to Telekom, this will increase Germany's AI computing power by 50 percent. For comparison, Sam Altman recently said that OpenAI will have "well over one million GPUs" online by the end of 2025. That's just OpenAI.

"Germany's strength in engineering and industry is legendary and will now be further enhanced by AI."

Jensen Huang, Nvidia CEO

The new initiative aims to give European companies the ability to build AI solutions using local data. Early partners are SAP, Polarise, and Agile Robots. The platform is intended to support applications such as factory simulation, robot training, and running large language models on site. The project, valued at over one billion euros, is privately funded and separate from the EU’s AI gigafactory funding program.
