
Cyberpunk 2077’s dialogue was lip-synced by AI

Cyberpunk 2077 is just a few weeks away, and the action-packed RPG will launch with voiced dialogue in 10 languages and subtitle options for several others. CD Projekt Red is aiming to add a deeper layer of immersion for players by using artificial intelligence to lip-sync the dialogue in multiple languages.

The game’s lead character technical director, Mateusz Popławski, said in a presentation (shared via @shinobi602 on Twitter) that CDPR targeted better lip-sync quality than in The Witcher 3, and in 10 languages: English, German, Spanish, French, Italian, Polish, Brazilian Portuguese, Russian, Mandarin and Japanese.

The goal was to achieve that for every character in the open world. Because of the game’s enormous scope, CDPR needed to pull it off with zero facial motion capture. To do so, the studio tapped Jali Research’s lip-syncing and facial animation tech to procedurally generate how the characters’ faces move.
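Jali’s exact pipeline isn’t public, but the broad idea behind audio-driven, mocap-free lip sync can be sketched simply: the dialogue audio and its transcript are broken into timed phonemes, each phoneme is mapped to a viseme (a mouth shape), and those visemes become animation keyframes. The Python sketch below is purely illustrative; the phoneme timings, the PHONEME_TO_VISEME table and the Keyframe structure are assumptions for the example, not Jali’s or CDPR’s actual data or API.

```python
from dataclasses import dataclass

# Illustrative phoneme-to-viseme table; real systems use far richer mappings
# plus co-articulation rules, where neighbouring sounds blend mouth shapes.
PHONEME_TO_VISEME = {
    "AA": "open",          # as in "father"
    "IY": "wide",          # as in "see"
    "UW": "round",         # as in "two"
    "M": "closed",
    "B": "closed",
    "F": "teeth_on_lip",
    "V": "teeth_on_lip",
}

@dataclass
class Keyframe:
    time: float    # seconds into the dialogue line
    viseme: str    # target mouth shape
    weight: float  # how strongly the shape is applied (0..1)

def visemes_from_phonemes(timed_phonemes):
    """Turn (phoneme, start_time, stress) tuples into animation keyframes.

    In a real pipeline the timed phonemes would come from aligning the audio
    against the transcript; here they are hand-written example data.
    """
    keyframes = []
    for phoneme, start, stress in timed_phonemes:
        viseme = PHONEME_TO_VISEME.get(phoneme, "neutral")
        # Stressed sounds get a more pronounced mouth shape.
        keyframes.append(Keyframe(time=start, viseme=viseme, weight=0.6 + 0.4 * stress))
    return keyframes

# Example: a fragment of the word "move" -> M, UW, V
print(visemes_from_phonemes([("M", 0.00, 0.0), ("UW", 0.08, 1.0), ("V", 0.21, 0.0)]))
```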

Some of the characters in Cyberpunk 2077 speak multiple languages, sometimes switching between them in the same sentence. To account for that, CDPR tagged the dialogue transcripts. The tags also helped adjust a character’s facial expressions when their emotional state changed within a line of dialogue, and the system used audio analysis to carry the emotion of a vocal performance into the animations.
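CDPR hasn’t published its tagging format, but you can picture how inline transcript tags might drive the system: each tag switches the active language or emotional state, and the text between tags gets animated using the rules for that state. Here’s a hypothetical, minimal version with a made-up tag syntax like `<lang:ja>` and `<emotion:angry>`; none of it reflects the studio’s real format.

```python
import re

# Hypothetical inline tag syntax, e.g. "<lang:ja>" or "<emotion:angry>".
TAG_PATTERN = re.compile(r"<(lang|emotion):(\w+)>")

def split_tagged_line(line, default_lang="en", default_emotion="neutral"):
    """Split a tagged dialogue line into segments, each carrying the
    language and emotion that should drive its facial animation."""
    segments = []
    lang, emotion = default_lang, default_emotion
    pos = 0
    for match in TAG_PATTERN.finditer(line):
        text = line[pos:match.start()].strip()
        if text:
            segments.append({"text": text, "lang": lang, "emotion": emotion})
        if match.group(1) == "lang":
            lang = match.group(2)
        else:
            emotion = match.group(2)
        pos = match.end()
    tail = line[pos:].strip()
    if tail:
        segments.append({"text": tail, "lang": lang, "emotion": emotion})
    return segments

# A character switching mood and language mid-sentence:
line = "You owe me, choom <emotion:angry> and I'm done waiting <lang:ja> wakatta ka?"
for segment in split_tagged_line(line):
    print(segment)
```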

CDPR used algorithms to help animate The Witcher 3’s cutscenes, and this system is an evolution of that approach. Many games use motion capture to lip-sync dialogue in just one language, which can break immersion a bit for players who switch to a different one.

While the procedurally generated animations in Cyberpunk 2077 might not be quite as detailed or expressive as those in some other AAA games, they could add to the experience for players who’d prefer to play in a different language. You’ll be able to take a closer look at how that works in practice when the game arrives on November 19th.


Author: Kris Holt, @krisholt

Source: Engadget
