
Why ‘The Mandalorian’ cites Fortnite dev Epic Games in its credits

Disney+ is here, and one of its first big exclusive shows is Star Wars spinoff The Mandalorian. The series follows a mysterious bounty hunter as he takes on secretive jobs following the fall of the Empire. But outside of the many Easter eggs and action scenes, fans also noticed that Fortnite developer Epic Games appears in the show’s credits. And that’s because series creator Jon Favreau has integrated Epic’s Unreal Engine tech into his filmmaking process.

Unreal is best known as a toolkit that developers use to design games in a creator-friendly environment. Epic uses it for Fortnite, but it’s also the engine powering Star Wars Jedi: Fallen Order, The Outer Worlds, and Yoshi’s Crafted World.

But Unreal is starting to make its way into more Hollywood productions. And Lucasfilm and its VFX house Industrial Light & Magic embraced it for The Mandalorian.

How ‘The Mandalorian’ uses Epic’s Unreal Engine

During an onstage discussion at the SIGGRAPH 2019 computer-graphics conference in Los Angeles in July, Favreau explained how Unreal helped with previsualization.

“We used the V-cam system where we get to make a movie essentially in VR, send those dailies to the editor, and we have a cut of the film that serves a purpose that previs would have,” explained Favreau.

But Favreau also brought Unreal onto the set to help with the entire flow of production. This uses a combination of technologies, but it primarily comes down to building computer-generated environments and then projecting them onto LED walls. Those projections then change their perspective and characteristics depending on the position of the camera and what kind of lens it is using.

You can see some of how that works in a clip from Epic.

The idea is that Unreal can provide on-set visual information for actors and other creatives. It can also provide accurate lighting so a VFX team can then come in and add the final CG geometry.
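The camera-dependent parallax described above comes down to an off-axis (asymmetric-frustum) projection: the renderer treats the LED wall as a window and rebuilds the view frustum from the tracked camera position to the wall’s corners every frame. Unreal handles this internally as part of its in-camera VFX pipeline; the sketch below is only an illustration of the underlying math, following Kooima’s well-known generalized perspective projection, not Epic’s actual implementation. The wall corners and camera position are assumed to come from an external tracking system.

```python
import numpy as np

def offaxis_projection(pa, pb, pc, pe, near, far):
    """Off-axis (asymmetric-frustum) projection for a flat screen.

    pa, pb, pc: lower-left, lower-right, upper-left wall corners (world space).
    pe: tracked camera/eye position (world space).
    Returns a 4x4 projection matrix combined with the rotation and
    translation that map world space into the wall's own frame.
    """
    # Orthonormal basis of the wall plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)           # right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal, toward the camera

    # Vectors from the camera to the wall corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe

    # Perpendicular distance from the camera to the wall plane.
    d = -np.dot(va, vn)

    # Frustum extents at the near plane (asymmetric when the camera is off-center).
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d

    # Standard asymmetric perspective (glFrustum-style) matrix.
    P = np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

    # Rotate world space into the wall's basis, then move the camera to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe
    return P @ M @ T
```

For example, for a 6-meter-wide, 3-meter-tall wall in the z = 0 plane and a camera tracked at (0.5, 1.2, 4.0), calling offaxis_projection with the three corners, that camera position, and near/far planes of 0.1 and 100 yields the skewed frustum that keeps the rendered background’s perspective locked to the physical camera. Recomputing it every frame as the camera moves is what makes the flat wall read as a set with real depth.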

Pointing a camera at a video game

But the on-set Unreal tech is coming along so fast that a VFX team doesn’t always have to replace the LED walls and Unreal renders. Instead, what you often see in The Mandalorian is the result of pointing the camera at a display running an environment from Epic’s video game tools.

“We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine, that I didn’t think Epic was going to be capable of,” Favreau said at SIGGRAPH. “For certain types of shots, depending on the focal length and shooting with anamorphic lensing, there’s a lot of times where it wasn’t just for [actor interactivity]. We could see in camera, the lighting, the interactive light, the layout, the background, the horizon. We didn’t have to mash things together later.”

So in certain shots in The Mandalorian, what you are seeing is a camera pointed at actors on a set standing in front of LED walls displaying environmental renders from Unreal Engine. But it’s not just convincing on film.

“And it would fool people,” Favreau said. “I had people come by the set from the studio who said, ‘I thought you weren’t building this whole set here,’ and I said, ‘No, all that’s there is the desk.’ Because it had parallax in perspective, it looked, even from sitting right there, if you looked at it casually, you thought you were looking at a live action set.”

What Unreal means for the future of filmmaking

One of the most obvious benefits of on-set visualization is that it gets rid of a lot of the guesswork for actors.

“Even though [the LED displays] might not hold up to the scrutiny if you’re staring right at it from close up, you’re still getting peripheral vision,” said Favreau. “You know where the horizon is, you feel the light on you. You’re also not setting up a lot of lights. You’re getting a lot of your interactive light off of those LED walls. To me, this is a huge breakthrough.”

But it goes beyond the acting. It creates situations where more people on set can understand what a shot is going to look like. More people can contribute ideas and play off of one another because they can comprehend the final shot.

“So, your cinematographer isn’t lighting to something that is greenscreen that’s going to happen later. They’re in there, telling you little intricacies, and I think, part of that, at least from what I’ve observed, is those happy accidents. Those things [that happen] because someone liked something a certain way, or those kinds of little things. That’s the magic of filmmaking.”

Favreau also thinks Unreal can take some of the procrastination out of the process.

“It forces you to make creative decisions early and not kick the can down the road,” he said. “Because you go to a set, put up a greenscreen, and figure it out later. But here you have all of these brilliant people — we have a hundred years of experience making cinema, why abandon that just because we’re disrupting the set? Let’s inherit the skillset of these great artists and build tools out.”


Author: Jeff Grubb
Source: Venturebeat
