
The ‘breakthrough’ AI racing agent GT Sophy debuts for Gran Turismo 7




Sony AI and Polyphony Digital have released Gran Turismo Sophy, a hyper-realistic AI racing agent for Gran Turismo 7.

The agent is based on research by Sony's AI researchers and Polyphony Digital's developers, and it fits Gran Turismo's mission of being the most realistic racing simulation game.

The agent, which runs on PlayStation 5, is built on a neural network and moved from a research project to a commercial release after two years of work. It learns the best way to drive through reinforcement learning, and then it's ready to take on humans. The aim isn't just to put on a good show, as with many racing games. The mission is to create AI that will challenge and even beat the best human racers.

“For the first time, actually, this is a really competitive AI. The built-in AI has been part of Gran Turismo since the start of the franchise,” said Michael Spranger, chief operating officer of Sony AI, in an interview with GamesBeat. “But that AI is a very narrow band. It is an intermediate AI that is too weak to train against humans. It takes some time actually to build an AI that is able to compete at the highest level. And that’s what we did. And that’s based on a technique called reinforcement learning.”


Gran Turismo Sophy Race Together is a new mode that gives Gran Turismo players of all levels and abilities the opportunity to go head-to-head against GT Sophy in GT7. The special mode, available as a time-limited in-game event, is a first look at GT Sophy in GT7 and is designed to maximize the fun and excitement of racing against GT Sophy for everyone.

A history of so-so AI

GT Sophy is on the track to beat human drivers.

AI is nothing new to racing games; players have raced against AI-driven cars throughout the Gran Turismo franchise's 25-plus-year history. But this kind of AI is different and more adaptive to the racing situation.

“Prior AI, which has been mostly the same for the last 20 years, tries to follow a line and a particular trajectory. So it’s trying to hit certain speeds at certain points,” said Peter Wurman, director of Sony AI America and project lead on GT Sophy, in an interview with GamesBeat. “And it’s very predictable. And it’s not nearly fast enough for really good (human) drivers.”

A modern AI approach

Gran Turismo Sophy works with multiple tracks.

This mission was a lot more ambitious. Sophy is named after Sophia, the Greek word for wisdom; the project itself is a combination of Sony AI and Polyphony Digital.

“The overall challenge was to train an AI agent that could race against the best players in the world and beat them,” Wurman said. “It involves technical challenges on driving of course but also dealing with the balance between being assertive and aggressive in the sense of trying to pass but also adhering to the rules of sportsmanship.”

Spranger said that traditional game AI has often relied on cheats that developers used to make the racing more exciting for players. But in Gran Turismo, the AI has access to the same game physics as a human player.

Kazunori Yamauchi, creator of the franchise and CEO of Polyphony Digital, is basically a racing fanatic who wants the game to be real. This is not to say that the game’s past AI was bad. Yamauchi has focused the game on realism for its entire history and he is still working on it today. Yamauchi worked with Sony AI on this project, which is more ambitious in its goals.

For aficionados of racing games, the use of cheats to fake the intelligence of AI is easily detected.

“People are not stupid, right? So if you cheat, people will notice. Suddenly, this car, which was very slow before, comes up behind me within a matter of a few seconds,” Spranger said. “In order to create a real racing experience, it’s very important that the AI really has the same access to the physics and it’s not like some privileged access, where you can sort of go around the corner faster than you might ever do.”

The tough part comes when cars are evenly matched: you might try to hold a driving line and beat the other car by blocking the path forward while going around a curve. A good driver might draft in a rival's slipstream, gaining additional speed, and wait for the chance to pass, either on the inside or the outside.

“If you do it really well and they go wide, you can cut on the inside and then pass,” he said.

AI reinforcement learning

Sony AI and Polyphony Digital went from a Nature paper to a commercial release in a year.

Neural networks started working well about a decade ago, as graphics processing units (GPUs) were adopted to handle AI workloads. That general trend has unlocked many deep learning applications, such as computer vision, that weren't previously possible.

With today’s computing power, these networks can be trained to learn tasks and improve over time.

“Reinforcement learning has unlocked the potential for this application. There may be other ways of approaching this,” Wurman said. “There is also an academic research community working on autonomous racing and so on. And this is really the first time I think that we were able to show the superhuman simulations in the real world, as it is a very real physical simulation.”

Reinforcement learning is a trial-and-error process. Designing the training took a couple of years, but the actual training took only about two weeks or so on some 20 PlayStation 5 consoles, Wurman said.
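
The article doesn't go into the algorithmic details, so the snippet below is only a generic, hypothetical illustration of that trial-and-error loop: a tabular Q-learning agent learning to ease off the throttle before a corner in a toy environment. The ToyCorner class, its rewards and the hyperparameters are all invented for this sketch and are not GT Sophy's real setup, which the Nature paper describes as deep reinforcement learning distributed across many PlayStation 5 consoles.

```python
# Minimal trial-and-error (reinforcement learning) sketch -- a toy illustration only.
# The environment, rewards and hyperparameters below are hypothetical.
import random
from collections import defaultdict

class ToyCorner:
    """Toy 'approach a corner' task: positions 0..9, pick a speed each step.
    Arriving at the corner (position >= 7) at full speed ends the episode with a crash."""
    def __init__(self):
        self.pos = 0

    def reset(self):
        self.pos = 0
        return self.pos

    def step(self, speed):                            # speed in {1, 2, 3}
        self.pos += speed
        if self.pos >= 7 and speed == 3:
            return self.pos, -10.0, True              # carried too much speed into the corner
        if self.pos >= 9:
            return self.pos, 10.0, True               # made it through the corner
        return self.pos, 0.1 * speed, False           # small reward for making progress

q = defaultdict(float)                                # Q-values keyed by (position, speed)
alpha, gamma, epsilon = 0.1, 0.95, 0.2                # learning rate, discount, exploration

env = ToyCorner()
for episode in range(5000):
    state, done = env.reset(), False
    while not done:
        # epsilon-greedy: mostly exploit what was learned so far, sometimes explore
        if random.random() < epsilon:
            action = random.choice((1, 2, 3))
        else:
            action = max((1, 2, 3), key=lambda a: q[(state, a)])
        next_state, reward, done = env.step(action)
        # Q-learning update: nudge the estimate toward reward + discounted best next value
        best_next = 0.0 if done else max(q[(next_state, a)] for a in (1, 2, 3))
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state

# After training, the greedy policy avoids taking full speed into the corner.
print([max((1, 2, 3), key=lambda a: q[(p, a)]) for p in range(10)])
```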

The company used cloud infrastructure to scale up the amount of racing experience the agents could gather during training, collaborating with Sony Interactive Entertainment's infrastructure teams. Sony AI has about 120 employees working on AI projects in Japan, Switzerland and North America.
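
The article describes that infrastructure only at a high level, so here is a minimal, hypothetical sketch of the general pattern it implies: several rollout workers generate experience in parallel and feed a single learner through a queue. It uses local Python processes and a random stand-in policy; the names and numbers are invented, and this is not Sony's actual pipeline.

```python
# Hedged sketch of scaling experience collection -- not Sony's pipeline.
# Many simulators produce transitions in parallel for one central learner.
import multiprocessing as mp
import random

def rollout_worker(worker_id, queue, episodes):
    """Simulate episodes and push (state, action, reward) transitions to the learner."""
    for _ in range(episodes):
        state = 0.0
        for _ in range(20):
            action = random.uniform(-1.0, 1.0)        # stand-in for a policy network
            reward = -abs(state + action)             # toy reward: stay near zero
            queue.put((worker_id, state, action, reward))
            state += action
    queue.put(None)                                   # sentinel: this worker is done

def learner(queue, num_workers):
    """Drain transitions from all workers; a real learner would update the network here."""
    finished, transitions = 0, 0
    while finished < num_workers:
        item = queue.get()
        if item is None:
            finished += 1
        else:
            transitions += 1                          # placeholder for a gradient update
    print(f"collected {transitions} transitions from {num_workers} workers")

if __name__ == "__main__":
    num_workers = 4
    queue = mp.Queue()
    workers = [mp.Process(target=rollout_worker, args=(i, queue, 50))
               for i in range(num_workers)]
    for w in workers:
        w.start()
    learner(queue, num_workers)
    for w in workers:
        w.join()
```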

“Exactly one year ago, we had a very big scientific breakthrough,” Spranger said. “And that breakthrough is training, through reinforcement learning, an agent to drive competitively against the best human drivers in Gran Turismo.”

The challenge of making it real

Can you beat an AI driver trained to have the best skills?

“The agent Gran Turismo Sophy has to master different skills. These skills make it different from other AI breakthroughs in this space,” Spranger said. “And the key thing that is different, for instance, from other kinds of game AI breakthroughs in the past, is the physical realism of the game. Gran Turismo is an extremely realistic physical simulation of racing.”

In fact, some professional race drivers started learning about racing by driving in Gran Turismo first, a story that is the basis of the movie Sony is making about the Gran Turismo franchise.

Pro racers actually use “racing sleds” with steering wheels, pedals and a big screen in front of them, and some players have gone on to become real racecar drivers. Of course, learning this way is easier, since there are no g-forces in a simulation, unlike in real racing.

“It still requires this intense amount of concentration for long periods of time,” Wurman said.

The results could vary, based on the drivers themselves as well as tracks and road conditions.

Early in the process, advisers at Polyphony Digital gave the AI team pointers on really good driving. The devs also offered feedback over time, Wurman said.

The first skill the agent has to master is driving a car at the edge of control. Racing is basically pushing the car almost to the point of losing grip, and the agent was able to master that.

The second skill is handling the intricate physical changes that happen when there are opponents on the track.

Driving behind another car in its slipstream raises your top speed, but it can also take you longer to brake, and you have to deal with sudden changes from other objects on the track, especially your opponents.

Overtaking, especially against humans, involves a back-and-forth of action and reaction: if the AI makes a move, a human driver isn't simply going to let it pass. Being collaborative is also important, since the agent can't just bump into other cars. On a real racetrack, such behavior makes the difference between life and death, Spranger said. Driving both safely and aggressively requires an intelligent balance.

“It’s about avoiding hitting other cars,” he said. “You need to drive aggressively, or others will just take advantage of you. We train progressively for the AI to make better and better decisions in the game. Relatively quickly the agent becomes a good driver. And it takes more time for the agent to become a top driver.”

The cars also have slightly different performance characteristics, with different levels of performance ranging from neophyte to expert. GT Sophy operates at a disadvantage at the beginner level, and the playing field evens out as you get closer to the expert level.

“In the last mode, you actually have exactly the same car as another GT Sophy driver so you can really try yourself against a superhuman agent,” he said.

How it works in Gran Turismo 7


In GT Sophy Race Together mode, players can race against GT Sophy across a series of four circuits of increasing difficulty, at a Beginner, Intermediate or Expert setting. In each of the four races, the player races against four GT Sophy cars of different performance levels.

Players can also challenge GT Sophy in 1VS1 mode, where GT Sophy and the player race one-on-one with identical car configurations and settings, which showcases the superhuman racing skills of GT Sophy. The excitement of GT Sophy Race Together Mode is enhanced with GT7’s new emoticon feature, which displays emoticons on the GT Sophy cars throughout the race to react to the in-game action.

The GT Sophy Race Together mode, part of the 1.29 update for GT7, arrives on February 20 at 10 p.m. Pacific time and is available until the end of March. The mode can be accessed directly from the top right panel on the GT7 World Map, and the player can start a race against GT Sophy once the player has reached Collector Level 6.

This is the first of an ongoing series of GT Sophy features that will appear in GT7. Over the past couple of years, Sony AI, in partnership with Polyphony Digital, has continued to evolve GT Sophy's capabilities and work toward the goal of making it accessible to the greater Gran Turismo community.

Player feedback on this initial special feature will be used to continually improve the GT Sophy Race Together mode for future releases. Wurman is excited to hear the feedback once millions of fans have tried GT Sophy.

Sony AI talked about the agent research in a February 2022 issue of the journal Nature. Sony AI Inc. was founded on April 1, 2020, with the mission to “unleash human imagination and creativity with AI.”

Where AI is going

Sony AI hopes to take reinforcement learning to new levels across Sony.

Sony is working on a variety of AI technologies beyond racing simulations.

Sony AI aims to combine research and development of artificial intelligence with Sony Group's imaging and sensing technology, robotics technology, and entertainment assets such as movies, music, and games to accelerate Sony's transformation into an AI-powered company and to create new business opportunities.

To achieve this, Sony AI has launched four flagship projects to date aimed at the evolution and application of AI technology in the areas of gaming, imaging and sensing, gastronomy, and AI ethics.

The company eventually hopes to bring the tech to Gran Turismo's virtual reality experience on the PlayStation VR2 headset, which debuts on February 22.

Can they put the GT Sophy racing agent into a real car and make it into a self-racing vehicle? Not yet.

So where does the AI go from here? Spranger said the teams will keep working on the technology for the future and perhaps bring it to other games. Sony is also doing autonomous driving research for real-world cars, and it’s also launching its own Afeela car in partnership with Honda in 2026.

“For general reinforcement learning, I do think it’s going to unlock next-generation gaming experiences in the future,” Spranger said. “For Sony, in general, I think AI is a key technology in those areas from sensing to the virtual world and back. And I think we can be a great contributor to technology advances in this space. That’s the mission the CTO of Sony (Toru Katsumoto) gave us.”



Author: Dean Takahashi
Source: Venturebeat

