The gap between simulated and real-world robot movements has always been a major hurdle in robotics. Now, researchers at Nvidia GEAR Lab and Carnegie Mellon University have developed a framework that helps bridge this divide.
Their system, called ASAP (Aligning Simulation and Real Physics), reduces motion errors between simulated and real movements by about 53 percent compared to existing methods. It works in two stages: robots are first trained in simulation, and a specialized correction model then accounts for real-world differences by learning to identify and compensate for discrepancies between virtual and physical movements.
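The article doesn't detail how that correction model is built, but the basic idea of learning a sim-to-real discrepancy can be sketched roughly as follows. This is a minimal, hypothetical illustration, not ASAP's actual implementation: the network, dimensions, and data here (`DeltaModel`, `state_dim`, the random placeholder trajectories) are assumptions, and whether the learned correction is applied to states or to actions depends on the paper's specific design.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a small network that learns the mismatch between
# simulated and real robot states, so simulated rollouts can later be
# corrected to behave more like the physical robot. All names and data
# below are placeholders, not taken from the ASAP codebase.

class DeltaModel(nn.Module):
    """Predicts a correction term from the current state and action."""
    def __init__(self, state_dim: int, action_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, state_dim),  # correction applied to the sim state
        )

    def forward(self, state: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([state, action], dim=-1))


def train_delta_model(model, sim_next, real_next, states, actions, epochs=100):
    """Fit the model to the gap between simulated and real next states."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    target = real_next - sim_next  # the sim-to-real discrepancy to learn
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(states, actions), target)
        loss.backward()
        opt.step()
    return model


# Toy usage with random placeholder data; in practice the pairs would come
# from running the same policy in simulation and on the physical robot.
state_dim, action_dim, n = 45, 12, 1024
model = DeltaModel(state_dim, action_dim)
states = torch.randn(n, state_dim)
actions = torch.randn(n, action_dim)
sim_next = torch.randn(n, state_dim)
real_next = sim_next + 0.1 * torch.randn(n, state_dim)
train_delta_model(model, sim_next, real_next, states, actions)
```

Once trained, such a model can be plugged back into the simulator so that policies are fine-tuned against dynamics that more closely match the real hardware.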
The research team describes this sim-to-real gap as one of the biggest challenges in robotics. With ASAP, robots can now transfer complex movements such as jumps and kicks directly from simulation to the real world.
From virtual training to athletic performances
During testing with the Unitree G1 humanoid robot, the team demonstrated a range of agile movements, including forward jumps spanning more than one meter. Across these tests, the system consistently achieved higher motion accuracy than competing approaches.
The team took things further by having the robot mimic sports celebrities like Cristiano Ronaldo, LeBron James and Kobe Bryant. Jim Fan, Senior Research Manager at Nvidia and head of GEAR, notes that the movement videos had to be slowed down so viewers could follow along.
But the project also revealed hardware limitations. Motors often overheated during dynamic movements, and two robots were damaged while collecting data.
The team sees this as just the beginning. ASAP could help teach robots more natural, versatile movements in the future. They've made their code available on GitHub for other researchers to build upon.