
Epic Games launches MetaHuman Animator to capture high-quality human faces with an iPhone

Epic Games has launched its MetaHuman Animator, a tool that enables developers to capture high-quality human faces using just an iPhone and a PC. And as the demo video shows, it’s remarkably good at producing lifelike digital people.

The company said the tool can deliver high-quality facial animation in minutes. First shown at GDC 2023, the latest version of the fast and easy digital human pipeline brings high-fidelity performance capture to MetaHumans.

MetaHuman Animator has features that enable you to capture an actor’s performance using an iPhone or stereo head-mounted camera system (HMC) and apply it as high-fidelity facial animation on any MetaHuman character, without the need for manual intervention.

Every subtle expression, look, and emotion is accurately captured and faithfully replicated on your digital human, Epic Games said. Even better, it’s simple and straightforward to achieve incredible results—anyone can do it.

If you’re new to performance capture, MetaHuman Animator is a convenient way to bring facial animation to MetaHumans based on real-world performances.

And if you already do performance capture, this new feature set will significantly improve your existing capture workflow, reduce time and effort, and give you more creative control. You just pair MetaHuman Animator with your existing vertical stereo head-mounted camera to achieve even greater visual fidelity.

More details

Actor Radivoje Bukvić delivers a monologue as a MetaHuman.

Previously, it would have taken a team of experts months to faithfully recreate every nuance of the actor’s performance on a digital character. Now, MetaHuman Animator does the hard work for you in a fraction of the time—and with far less effort.

The new feature set uses a 4D solver to combine video and depth data together with a MetaHuman representation of the performer. The animation is produced locally using GPU hardware, with the final animation available in minutes.
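Epic hasn’t published the solver’s internals, but the core idea behind fitting a face model to per-frame video and depth data can be sketched in a few lines. The toy example below solves for blendshape weights with ordinary least squares; every name in it is hypothetical, and a real 4D solver is far more sophisticated.

```python
# A toy "solve one frame" illustration: given a neutral face and a set of
# blendshape deltas (the model), find per-frame weights that best explain
# the observed 3D points from video + depth. This is NOT Epic's 4D solver,
# just the textbook least-squares core such systems are built around.
import numpy as np

rng = np.random.default_rng(0)

N_POINTS = 500          # tracked 3D points on the face (from video + depth)
N_SHAPES = 20           # rig controls / blendshapes in the face model

neutral = rng.normal(size=(N_POINTS, 3))            # actor's neutral geometry
deltas = rng.normal(size=(N_SHAPES, N_POINTS, 3))   # per-control displacement

def solve_frame(observed: np.ndarray) -> np.ndarray:
    """Least-squares fit of blendshape weights to one frame of 3D data."""
    A = deltas.reshape(N_SHAPES, -1).T              # (3*N_POINTS, N_SHAPES)
    b = (observed - neutral).ravel()                # residual to explain
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(w, 0.0, 1.0)                     # rig controls live in [0, 1]

# Fake an observed frame driven by known weights, then recover them.
true_w = np.clip(rng.uniform(-0.5, 1.0, N_SHAPES), 0, 1)
observed = neutral + np.tensordot(true_w, deltas, axes=1)
print(np.allclose(solve_frame(observed), true_w, atol=1e-6))  # True
```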

That all happens under the hood. Devs can point the camera at the actor and press record. Once captured, MetaHuman Animator accurately reproduces the individuality and nuance of the actor’s performance onto any MetaHuman character.

What’s more, the animation data is semantically correct, using the appropriate rig controls, and temporally consistent, with smooth control transitions so it’s easy to make artistic adjustments if you want to tweak the animation.
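In practice, “semantically correct” means the output lands on named rig controls rather than raw vertex positions, which is what makes it editable. Here is a minimal illustration of that representation, with a crude moving-average filter standing in for whatever temporal smoothing Epic actually applies (the control names are hypothetical):

```python
# A minimal sketch of "semantically correct" animation data: values keyed on
# named rig controls (not raw vertices), with a simple smoothing pass as a
# stand-in for the temporal consistency the real solver provides.
from collections import defaultdict

def smooth(curve: list[float], radius: int = 1) -> list[float]:
    """Centered moving average; a crude jitter filter for illustration."""
    out = []
    for i in range(len(curve)):
        lo, hi = max(0, i - radius), min(len(curve), i + radius + 1)
        out.append(sum(curve[lo:hi]) / (hi - lo))
    return out

# Per-control curves, one value per frame (control names are hypothetical).
curves: dict[str, list[float]] = defaultdict(list)
for frame_weights in [{"jaw_open": 0.1, "brow_raise_L": 0.0},
                      {"jaw_open": 0.6, "brow_raise_L": 0.3},
                      {"jaw_open": 0.5, "brow_raise_L": 0.2}]:
    for name, value in frame_weights.items():
        curves[name].append(value)

smoothed = {name: smooth(vals) for name, vals in curves.items()}
print(smoothed["jaw_open"])  # ~[0.35, 0.4, 0.55]
```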

Blue Dot video

Blue Dot, a short film created by Epic Games’ 3Lateral team in collaboration with local Serbian artists, showed off what can be done. Actor Radivoje Bukvić delivers a monologue based on a poem by Mika Antic. The performance was filmed at Take One studio’s mocap stage with cinematographer Ivan Šijak acting as director of photography. The result is convincing enough that it’s easy to forget the human in the video is an animation.

These nuanced results demonstrate the level of fidelity that artists and filmmakers can expect when using MetaHuman Animator with a stereo head-mounted camera system and traditional filmmaking techniques.

The team was able to achieve this impressive level of animation quality with minimal interventions on top of MetaHuman Animator results.

Facial animation for any MetaHuman

This is an animated character — a MetaHuman from Epic Games.

Epic Games said the facial animation devs capture using MetaHuman Animator can be applied to any MetaHuman character or any character adopting the new MetaHuman facial description standard in just a few clicks.

That means devs can design a character the way they want, safe in the knowledge that the facial animation applied to it will work.

To get technical for a minute, that is achievable because Mesh to MetaHuman can now create a MetaHuman Identity from just three frames of video, along with depth data captured using an iPhone or reconstructed using data from a vertical stereo head-mounted camera.

This personalizes the solver to the actor, enabling MetaHuman Animator to produce animation that works on any MetaHuman character. It can even use the audio to produce convincing tongue animation.
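As a rough intuition for what “personalizing the solver” means, the toy sketch below estimates an actor-specific neutral from a few noisy captured frames. Mesh to MetaHuman fits a full parametric identity, so treat this averaging stand-in purely as an illustration.

```python
# A toy take on "personalizing the solver": estimate the actor's identity
# (here, just a neutral point cloud) from a handful of captured frames, so
# later per-frame solves explain motion relative to *this* actor, not a
# generic template. Real Mesh to MetaHuman fits a full parametric identity;
# this averaging stand-in only illustrates the idea.
import numpy as np

def personalize(frames: np.ndarray) -> np.ndarray:
    """Average a few registered frames (shape: [n_frames, n_points, 3])."""
    return frames.mean(axis=0)

rng = np.random.default_rng(1)
true_neutral = rng.normal(size=(500, 3))
# Three near-neutral frames with sensor noise, as from iPhone depth capture.
frames = true_neutral + rng.normal(scale=0.01, size=(3, 500, 3))
estimated = personalize(frames)
print(np.abs(estimated - true_neutral).mean())  # small residual error
```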

Use an iPhone for capture

An actor was captured by an iPhone and then converted into a MetaHuman.

Epic Games wants to take facial performance capture from something only experts with high-end capture systems can achieve, and turn it into something for all creators.

At its simplest, MetaHuman Animator can be used with just an iPhone (12 or above) and a desktop PC. That’s possible because Epic has updated the Live Link Face iOS app to capture raw video and depth data, which is then ingested directly from the device into Unreal Engine for processing.
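Epic hasn’t documented the capture format, but conceptually the ingest step has to pair each raw video frame with its nearest depth frame. A generic nearest-timestamp pairing looks like this (the timestamps and rates below are made up, not Live Link Face’s actual format):

```python
# Raw video and depth are captured as separate streams; before solving, each
# RGB frame needs its nearest depth frame. A generic nearest-timestamp pairing
# (file layout and rates here are hypothetical).
import bisect

def pair_streams(rgb_ts: list[float], depth_ts: list[float]) -> list[tuple[float, float]]:
    """For each RGB timestamp, find the closest depth timestamp."""
    pairs = []
    for t in rgb_ts:
        i = bisect.bisect_left(depth_ts, t)
        candidates = depth_ts[max(0, i - 1): i + 1]
        pairs.append((t, min(candidates, key=lambda d: abs(d - t))))
    return pairs

rgb = [0.000, 0.033, 0.066]      # e.g. 30 fps video timestamps (seconds)
depth = [0.001, 0.034, 0.068]    # depth stream, slightly offset
print(pair_streams(rgb, depth))  # [(0.0, 0.001), (0.033, 0.034), (0.066, 0.068)]
```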

Developers can also use MetaHuman Animator with an existing vertical stereo head-mounted camera system to achieve even greater fidelity.

Whether devs are using an iPhone or stereo HMC, MetaHuman Animator will improve the speed and ease of their capture workflow. This gives them the flexibility to choose the hardware best suited to the requirements of a shoot and the level of visual fidelity they’re looking to hit.

The captured animation data supports timecode, so facial performance animation can easily be aligned with body motion capture and audio to deliver a full character performance.
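Timecode reduces that alignment to arithmetic: convert each track’s start timecode to an absolute frame count, and the difference is the offset to apply. A minimal non-drop-frame SMPTE example:

```python
# Timecode makes alignment a subtraction: convert each track's start timecode
# to an absolute frame count, and the difference is the offset to apply.
# Non-drop-frame SMPTE at a fixed rate, for simplicity.
def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """'HH:MM:SS:FF' -> absolute frame number (non-drop-frame)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

face_start = timecode_to_frames("01:00:05:12")
body_start = timecode_to_frames("01:00:05:00")
# Shift the body track forward 12 frames so both performances line up.
print(face_start - body_start)  # 12
```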

Perfect for making creative choices on set

MetaHuman Animator is perfectly adapted for creative iteration on set because it enables you to process and transfer facial animation onto any MetaHuman character, fast.

With animation data reviewed right there in Unreal Engine while the dev is still on the shoot, the quality of the capture can be evaluated well in advance of the final character being animated.

And because reshoots can take place while the actor is still on stage, devs can get the best take in the can there and then, instead of having to absorb the cost and time needed to bring everyone back at a later date.



Author: Dean Takahashi
Source: VentureBeat
