
Epic Games and Ninja Theory show off Senua’s face from Senua’s Saga: Hellblade II


Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.


Epic Games showed off some new technology alongside Ninja Theory, the Microsoft-owned studio behind Senua’s Saga: Hellblade II.

MetaHuman Animator is a tool for creating extremely realistic animation: it takes video captured from human actors and converts it almost instantly into animation data that can drive 3D characters in games and films.

It was one of the cool demos that Epic Games showed at its State of Unreal event at the Game Developers Conference in San Francisco today.



Epic showed off the tech through the Ninja Theory game, a sequel to 2017’s Hellblade: Senua’s Sacrifice, which was noted for its outstanding human animation. Melina Juergens, the performance-capture actor who plays Senua, made an appearance to show how the same tech can now capture a performance on an iPhone as a MetaHuman animation. It works with the Live Link Face app on mobile devices.

The tech can generate a face model from a few captured pictures within a minute or so and convert it into an asset ready for use in computer-animated films or games. Ninja Theory also gave us a glimpse of what Senua will look like in the upcoming Hellblade II.

Project M

Epic Games’ event also featured new technology that makes it easier for creators to produce realistic 3D animation.

Songyee Yoon, NCSoft’s president and chief strategy officer, showed off imagery from Project M, an upcoming action-adventure game with extremely realistic graphics and stellar human animation.

Project M from NCSoft.

Yoon was on stage to introduce the company’s latest project.

In the trailer, a digital human version of NCSoft’s chief creative officer (CCO), Taekjin Kim, appeared on screen and guided the viewers through Project M’s world and core gameplay.

This digital human was developed using NCSoft’s AI technology and its advanced art and graphics capabilities. The digital human’s voice in the trailer was generated with the company’s AI text-to-speech (TTS) synthesis technology, which converts text into natural-sounding speech reflecting a specific person’s voice, accent, and emotion.

The digital human’s facial expressions and lip-sync were generated with the company’s voice-to-face technology, an AI system that automatically produces facial animation matching a given text or voice track. Combined with NCSoft’s visual technologies, it created the digital human’s realistic facial appearance.



Author: Dean Takahashi
Source: Venturebeat

