Sony’s new gaming AI “GT Sophy” was trained via deep reinforcement learning. It can outrun the fastest Gran Turismo pros.
GT Sophy, which Sony bills as the first “superhuman AI” of its kind, has beaten four of the world’s best human drivers in the racing simulation Gran Turismo. The human-vs-AI competitions took place in July and October 2021.
GT Sophy outperformed the gaming pros in time trial races and in a head-to-head race in the format of the FIA-certified Gran Turismo championship. “GT Sophy showed us new possibilities that we hadn’t imagined before,” says Igor Fraga, champion of the 2018 FIA Gran Turismo Championship.
The AI was developed in a collaboration between Sony’s PlayStation division, Sony AI (founded in April 2020), and Gran Turismo studio Polyphony Digital. The scientific paper, “Outracing champion Gran Turismo drivers with deep reinforcement learning,” was published in the journal Nature.
Model-free reinforcement learning for unbeatable racing AI
The current work builds on a publication from the fall of 2020: back then, Sony, in collaboration with the University of Zurich, published an AI for Gran Turismo Sport trained via reinforcement learning.
That AI acquired its driving skills without human priors such as hand-coded knowledge of the racing line or the game physics, learning solely through trial and error guided by rewards.
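The trial-and-error learning described above can be sketched with a minimal tabular Q-learning loop. The toy one-dimensional “track” below, with a single corner that punishes full throttle, is entirely invented for illustration; the real system uses deep neural networks and a full racing simulator, but the principle of learning purely from rewards is the same.

```python
import random

random.seed(0)  # reproducible toy run

TRACK_LENGTH = 10
CORNER = 5          # braking is required at this position
ACTIONS = ("throttle", "brake")

def step(pos, action):
    """Advance one step; full throttle in the corner crashes the car."""
    if pos == CORNER and action == "throttle":
        return None, -10.0          # crash: large negative reward
    reward = 1.0 if action == "throttle" else 0.1
    new_pos = pos + 1
    if new_pos >= TRACK_LENGTH:
        return None, reward + 5.0   # finished the lap: bonus
    return new_pos, reward

def train(episodes=2000, alpha=0.3, gamma=0.9, epsilon=0.1):
    """Model-free Q-learning: no track model, only observed rewards."""
    q = {(p, a): 0.0 for p in range(TRACK_LENGTH) for a in ACTIONS}
    for _ in range(episodes):
        pos = 0
        while pos is not None:
            if random.random() < epsilon:          # explore
                action = random.choice(ACTIONS)
            else:                                  # exploit learned values
                action = max(ACTIONS, key=lambda a: q[(pos, a)])
            new_pos, reward = step(pos, action)
            future = 0.0 if new_pos is None else max(q[(new_pos, a)] for a in ACTIONS)
            q[(pos, action)] += alpha * (reward + gamma * future - q[(pos, action)])
            pos = new_pos
    return q

q = train()
# The learned policy brakes in the corner and accelerates everywhere else.
policy = {p: max(ACTIONS, key=lambda a: q[(p, a)]) for p in range(TRACK_LENGTH)}
```

The agent is never told where the corner is; it discovers the braking point because crashing yields a large negative reward, which is exactly the “specific rewards” mechanism the text describes.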
The further developed AI model “GT Sophy” is likewise trained via reinforcement learning without a predefined model, albeit with human assistance: for some particularly demanding maneuvers, such as slipstream driving, Sony manually created exemplary training scenarios and also provided sparring partners for the AI.
Sony emphasizes that GT Sophy is both fast and fair. According to Sony, through deep reinforcement learning combined with mixed-scenario training, the artificial intelligence
- has developed a deep understanding of car dynamics and racing lines,
- is able to apply different racing tactics depending on the race situation: slipstreaming, overtaking, or defensively blocking other cars,
- and adheres to “highly refined, but imprecisely specified, sportsmanship rules”. Sophy avoids collisions, for example, and respects opposing driving lines.
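One common way to encode such “imprecisely specified” sportsmanship rules in reinforcement learning is reward shaping: progress is rewarded while collisions and unsporting moves incur penalties. The function and weights below are illustrative assumptions, not Sony’s actual reward design.

```python
# Hedged sketch: sportsmanship as penalty terms in the reward signal.
# All terms and weights here are invented for illustration.

def lap_reward(progress_m: float,
               collided: bool,
               off_track: bool,
               blocked_opponent_line: bool) -> float:
    reward = 0.1 * progress_m          # reward course progress (metres gained)
    if collided:
        reward -= 5.0                  # contact with another car
    if off_track:
        reward -= 2.0                  # leaving track limits
    if blocked_opponent_line:
        reward -= 1.0                  # unsporting defensive swerve
    return reward

# A clean step earns more than a faster but dirty one, so the trained
# agent learns to avoid collisions even when they would gain distance:
clean = lap_reward(20.0, collided=False, off_track=False, blocked_opponent_line=False)
dirty = lap_reward(25.0, collided=True, off_track=False, blocked_opponent_line=False)
```

Because the agent maximizes cumulative reward, tuning these penalty weights is how developers trade raw speed against fair driving without spelling out every rule explicitly.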
The following video demonstrates a complex overtaking maneuver by GT Sophy.
Sophy to be integrated into future Gran Turismo games
For AI training, Sony also developed the cloud-based computing platform DART (Distributed Asynchronous Rollouts and Training), which has access to more than 1,000 PS4 consoles. Through DART, the researchers collected training data and evaluated already trained Sophy models in order to improve the AI further.
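The asynchronous rollout/training split behind a platform like DART can be sketched with a producer-consumer pattern: many workers generate gameplay rollouts in parallel while a trainer consumes them as they arrive. The threads below stand in for PS4 consoles; all names and the data format are invented for this sketch.

```python
import queue
import threading

# Minimal sketch of distributed asynchronous rollouts and training.
# Worker threads play the role of game consoles producing rollout data;
# a single trainer consumes batches from a shared queue as they arrive.

rollout_queue = queue.Queue()

def rollout_worker(worker_id: int, episodes: int) -> None:
    """Simulate one console producing rollout data asynchronously."""
    for episode in range(episodes):
        # A real worker would run the game and record states, actions, rewards.
        rollout_queue.put({"worker": worker_id, "episode": episode, "reward": 1.0})

def train_from_rollouts(num_workers: int, episodes_per_worker: int) -> float:
    """Consume rollouts as they arrive; return total collected reward."""
    workers = [
        threading.Thread(target=rollout_worker, args=(i, episodes_per_worker))
        for i in range(num_workers)
    ]
    for w in workers:
        w.start()
    total_reward = 0.0
    for _ in range(num_workers * episodes_per_worker):
        batch = rollout_queue.get()       # blocks until data is available
        total_reward += batch["reward"]   # a real trainer would update the policy here
    for w in workers:
        w.join()
    return total_reward

total = train_from_rollouts(num_workers=4, episodes_per_worker=10)
```

Decoupling rollout generation from training this way is what lets a fleet of consoles keep producing data while the learner updates the model at its own pace.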
According to Sony, GT Sophy could be integrated as an AI opponent in future versions of the Gran Turismo series. The AI is also a contribution to areas outside of gaming, such as autonomous driving and high-speed robotics and control, says Sony AI CEO Hiroaki Kitano.
That Sony is quite proud of what it calls an “AI breakthrough” is apparent from the elaborate official website, which documents the development of GT Sophy in detail. Sony AI wants to get even more involved with AI in gaming in the future.