The gap between simulated and real-world robot movements has always been a major hurdle in robotics. Now, researchers at Nvidia GEAR Lab and Carnegie Mellon University have developed a framework that helps bridge this divide.

Their system, called ASAP (Aligning Simulation and Real Physics), cuts down motion errors between simulated and real movements by about 53 percent compared to existing methods. It works in two stages: first training robots in simulation, then using a specialized model to account for real-world differences. This model learns to spot and adjust for variations between virtual and physical movements.
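
To make the two-stage idea more concrete, here is a minimal sketch of how such an action-correction step could look. It is not the authors' released implementation; the module names, network sizes, and state/action dimensions below are illustrative assumptions.

```python
# Hedged sketch of ASAP's core idea (not the official code):
# a policy trained purely in simulation proposes an action, and a separately
# trained correction model adjusts it so that simulated dynamics line up
# better with what the real robot actually does.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 48, 12  # assumed state/action sizes for a humanoid


class ActionCorrectionModel(nn.Module):
    """Predicts a small correction to the policy's action from the current state."""

    def __init__(self, obs_dim: int, act_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + act_dim, 128), nn.ReLU(),
            nn.Linear(128, act_dim),
        )

    def forward(self, obs: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([obs, action], dim=-1))


# Stage 1 stand-in: a policy trained entirely in simulation.
base_policy = nn.Sequential(nn.Linear(OBS_DIM, 128), nn.ReLU(), nn.Linear(128, ACT_DIM))

# Stage 2 stand-in: the correction model would be trained on real-robot
# trajectories; here we only show how a correction is applied at run time.
correction_model = ActionCorrectionModel(OBS_DIM, ACT_DIM)

obs = torch.randn(1, OBS_DIM)                        # current robot state
action = base_policy(obs)                            # action from the sim-trained policy
corrected = action + correction_model(obs, action)   # action adjusted for the sim-to-real gap
print(corrected.shape)  # torch.Size([1, 12])
```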

The gap between simulation and reality is one of the biggest challenges in robotics, as the research team notes. With ASAP, robots can now transfer complex movements like jumps and kicks directly from simulation to the real world.

From virtual training to athletic performances

During testing with the Unitree G1 humanoid robot, the team demonstrated various agile movements, including forward jumps spanning more than one meter. The system consistently showed better movement accuracy than other approaches.

Video: Nvidia / CMU

The team took things further by having the robot mimic sports celebrities like Cristiano Ronaldo, LeBron James and Kobe Bryant. Jim Fan, Senior Research Manager at Nvidia and head of GEAR, notes that they had to slow down the movement videos so that viewers could follow along.

The robot imitates Kobe Bryant. | Video: Nvidia / CMU

But the project also revealed hardware limitations. Motors often overheated during dynamic movements, and two robots were damaged while collecting data.

The team sees this as just the beginning. ASAP could help teach robots more natural, versatile movements in the future. They've made their code available on GitHub for other researchers to build upon.

Summary
  • Researchers at Nvidia GEAR Lab and Carnegie Mellon University have developed ASAP (Aligning Simulation and Real Physics), a framework that reduces errors between simulated and real robot movements by about 53 percent compared to existing methods.
  • ASAP works in two stages: training robots in simulation, then using a specialized model to account for real-world differences by spotting and adjusting for variations between virtual and physical movements.
  • During testing with the Unitree G1 humanoid robot, the team demonstrated agile movements like forward jumps over one meter and had the robot mimic sports celebrities, although motors often overheated and two robots were damaged while collecting data.