Date published: Feb 9, 2022
Gran Turismo Sophy: Training AI to be a Champion-Level Racer
AI breakthroughs to date range from agents that have mastered strategy games such as chess, shogi, and Go to more complex, real-time multiplayer strategy games. Gran Turismo Sophy, a collaboration between Polyphony Digital, Sony AI, and Sony Interactive Entertainment, takes machine learning to the next level by introducing an AI agent to a hyper-realistic racing simulation that requires rapid decisions within the complex dynamics of a race against opponents.
Sony AI COO Michael Spranger describes Gran Turismo Sophy as “an AI agent that learned to drive by itself at a very competitive level and is able to compete with the best GT Sport drivers in the world.” Trained via a technique called reinforcement learning, Gran Turismo Sophy began as a blank slate and evolved from an AI that could barely maintain a straight line on a track to a racer that pushed the best Gran Turismo (GT) Sport drivers in the world to their limits.
Check out a video message from Kenichiro Yoshida, CEO of Sony Group, and read the latest from the Sony Corporate Blog about this fantastic project.
The First True Test
Gran Turismo Sophy’s large-scale training began in January 2021 and, after being put up against various Polyphony Digital team members and top GT drivers, Gran Turismo Sophy was ready for its first major test.
At the first “Race Together” event on July 2, 2021, Gran Turismo Sophy faced a team that included Takuma Miyazono, the top Gran Turismo Sport driver in the world, and three other top drivers. The AI agent raced as part of a team, with the competition set across three track and car combinations.
In time trials, racers took on tracks solo and logged their times. Gran Turismo Sophy raced the same courses solo and beat the racers’ times. Sharing the track with other drivers, however, was a different challenge: Gran Turismo Sophy didn’t fare as well in the competitive team race.
“I think we all underestimated how hard it would be to get the sportsmanship side of it right, and learn to do that without being overly aggressive or overly timid in the face of competitors,” said Peter Wurman, Sony AI Director and Project Lead.
To find out more about how Gran Turismo Sophy handled the complex sport of race car driving in Gran Turismo Sport, check out the latest cover story in Nature, “Outracing Champion Gran Turismo Drivers with Deep Reinforcement Learning.”
Racing Done Right
Further, an AI agent’s performance can only be as strong as the platform it runs on. Polyphony Digital CEO Kazunori Yamauchi and his Gran Turismo team captured the dynamics and physics of the sport with a fidelity that similar racing games achieve only partially.
“When cars, and the entire culture around cars, were appearing in video games, I wanted to recreate that,” Yamauchi says. Polyphony Digital Engineer Charles Ferreira adds that the “realism of Gran Turismo comes from the detail that we put into the game,” from the engine, tires, and suspension to the tracks and car models. That realism initially presented a challenge for the Sony AI team, but it ultimately pushed Gran Turismo Sophy to new heights.
Training with Mass-Scale Infrastructure
In reinforcement learning, Gran Turismo Sophy was graded with positive or negative feedback based on different inputs, such as how fast it was going, which way its wheels were pointed, and the curvature of the track. Mirroring how humans require an estimated 10,000+ hours to become proficient at a skill, Gran Turismo Sophy duplicated itself and took on many different scenarios simultaneously. This required a great deal of computing power, which was provided by Sony Interactive Entertainment.
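The grading described above can be pictured as a reward function that scores each moment of driving. The sketch below is a minimal illustration of that idea; the input names (speed, wheel angle, track curvature, off-track flag) and the weights are assumptions for illustration, not Gran Turismo Sophy’s actual observation space or reward design.

```python
def compute_reward(speed: float, wheel_angle: float,
                   track_curvature: float, off_track: bool) -> float:
    """Grade one timestep of driving with positive or negative feedback."""
    reward = speed * 0.01                          # progress: faster is better
    # Penalize steering that fights the curvature of the track ahead.
    reward -= 0.1 * abs(wheel_angle - track_curvature)
    if off_track:
        reward -= 1.0                              # leaving the track is bad
    return reward
```

A learning agent would receive this score after every action and gradually adjust its driving to maximize the total reward over a lap, which is what lets it start as a blank slate and improve on its own.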
“With a standard AI simulation, a model is created and then run. Analysis occurs and updates are then added to that simulation and run again. This process can sometimes become extremely time consuming,” said Justin Beltran, Sr. Director, Future Technology Engineering, Sony Interactive Entertainment.
“However, leveraging SIE’s worldwide mass-scale cloud gaming infrastructure, the Gran Turismo Sophy team was able to successfully run tens of thousands of simultaneous simulations using state-of-the-art learning algorithms and training scenarios in a significantly reduced period of time,” continued Beltran.
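The speedup Beltran describes comes from running many independent simulations at once rather than one after another. The toy sketch below illustrates that pattern only; the simulation is a random stand-in for the actual game engine, and a thread pool stands in for SIE’s cloud infrastructure.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_simulation(seed: int) -> float:
    """One self-contained, seeded rollout; returns the episode's total score."""
    rng = random.Random(seed)
    return sum(rng.uniform(-1, 1) for _ in range(100))

def parallel_rollouts(num_sims: int) -> list[float]:
    """Run many simulations concurrently and collect their results,
    instead of waiting for each one to finish before starting the next."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_simulation, range(num_sims)))
```

In a real training setup the collected results would feed back into a single model update before the next wave of simulations launches, so the wall-clock cost of gathering experience shrinks roughly with the number of workers.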
Returning to the Track
On October 21, 2021, the second race day arrived, along with hopes that Gran Turismo Sophy would win every competition, including the team race. Not only did it dominate across the board, but the team also watched it adapt to adversity: after a wipeout early in the third race, it still went on to finish first. It took 1st and 2nd place in all three races and won the team competition with double the humans’ points.
However, the future of AI agents is not to replace human interaction but to expand and enrich the gaming experience for all players. “We’re going to create artificial intelligence that will unleash the power of human creativity and imagination,” says Hiroaki Kitano, Sony AI CEO.
But what does this mean for the future of gaming? In short, the possibilities are endless!
“We envision a future where AI agents could introduce developers and creators to new levels of innovation and unlock doors to unimagined opportunities,” said Ueli Gallizzi, SVP of Future Technology Group, Sony Interactive Entertainment. “We could see unexpected levels of user engagement, better gaming experiences, and the entry of a whole new generation into the world of gaming.”
We can’t wait to see what lies ahead as the worlds of artificial intelligence and interactive entertainment are bridged; Gran Turismo Sophy is the next step in this exciting adventure.