
How Epic Games sees the new convergence of games and Hollywood | The DeanBeat

We’re still mourning E3, but I’m looking forward to the GamesBeat Summit 2023 event ahead, which will feature Robert Kirkman, creator of The Walking Dead. And while there isn’t much left to contemplate about E3, there is still a lot to reminisce about from the recent Game Developers Conference. In particular, I still have quite a few things to note about Epic Games’ State of Unreal event.

During the event, the Epic Games team showed off Unreal Engine 5.2 as well as the latest advances in digital humans, with demos of works such as Project M and Senua’s Saga: Hellblade 2. Afterward, I interviewed them about the convergence of games and Hollywood.

We talked about how they have worked in games and film and have often thought about the coming together of these two industries.

>>Follow VentureBeat’s ongoing GDC 2023 coverage<<


That convergence is exemplified more than ever as game properties are converted into great TV shows like HBO’s The Last of Us. The Super Mario movie is out now, and Tetris recently debuted on Apple TV. The list goes on with Arcane, Uncharted, Sonic the Hedgehog, Cyberpunk: Edgerunners, Dungeons & Dragons and more.

Whether they’re critically acclaimed or not, these films and shows are drawing viewers. And it’s a common occurrence for creators to conceive of worlds and to have those become both films and video games. As Anna Sweet of Bad Robot Games noted, there are different doors into these worlds.

I spoke with Tim Sweeney, CEO of Epic Games, about these trends during the recent Game Developers Conference. But Epic has an interesting bench when it comes to graphics wizardry. And so I also talked with Epic CTO Kim Libreri; Nick Penwarden, vice president of engineering; and Vlad Mastilovic, vice president of Digital Humans Technology.

They also had a lot to say about Epic’s vision for the convergence of games and entertainment.

Sweeney devoured a bag of popcorn while I was chatting with him, but none of these folks ate while I interviewed them. They were feasting on visual delights instead. I’ve embedded a number of the videos from the State of Unreal event. Please check them out if you want to be dazzled.

And here’s an edited transcript of our interview.

Left to right: Nick Penwarden, Kim Libreri and Vlad Mastilovic of Epic Games.

GamesBeat: I just interviewed Tim, and it ended with this notion that matches up with our next two events. We’re going to have a gaming and Hollywood element to them. One of the people we talked to for this said that it finally seems like games and Hollywood are in the same ecosystem. They were so divergent before. Everything they did together used to fail. Now it feels like a new time. You and your vision seem like you’re completely in the middle of making this happen?

Kim Libreri: A bunch of us worked in both industries. I spent 20 years in movies, 20 years in games now. I always played games as a little kid. When I was at university I had this dream that I was playing a Star Wars game that looked just like the movies. It was the attack on Hoth. “That’s what I’m going to do in my life. I’m going to make these two industries cross over to where a movie and a game can look the same, have the same wonder, the same storytelling, the same experience, but with one technology stack.” As a computer graphics engineer at the time, it was awesome to come to this place.

Vlad Mastilovic: When I was in high school I was quite impressed with Cameron’s The Abyss, and with Kim’s work on The Matrix as well. I used to walk through the park and try to reconstruct how it was done. I made some very naïve initial attempts at re-creating people, and I was failing for years. But I was so fascinated. I just kept trying. Ultimately it led to the profession that I’m still in.

Nick Penwarden: My own story, I used to go out on hikes and just imagine–I’d sit there looking at rivers and streams for hours and imagine how you would re-create that in computer graphics and make it real.

Libreri: The Opal demo, the one with the Rivian in it and the forest–we weren’t going to do an engine demo for GDC this year. We were going to do the MetaHuman Animator, and obviously, we were going to do the UEFN demo, but we thought that maybe we didn’t need to do one. We would just talk about the features.

We’ve been on this big scanning expedition in California. We were starting to mess around with the assets. We’re doing this new foliage, a new light transmission model. We can now have foliage run in a sort of Nanite-compatible way. They were showing us screenshots and we thought, “Oh my goodness, that’s awesome.” With his hiking background–we asked about the performance, and they said it was pretty good. But we didn’t want to put more stress on the engine team. That whole demo started in January. With that puddle of water, we asked the physics team. “Do you think the fluid solver is good enough to do a puddle of water?” “Should be.” “How are we going to move through the environment?” “Well, I drive a Rivian, so I wonder if they’ll let us borrow the car to put it in this demo.”

They were super cool about it. “Just don’t let it do anything dodgy.” We put it in there and we have this new material system. The paint on the car is pretty awesome. It’s metallic. The car is covered in dirt, so it shows off the layers. We wanted to show off the depth of the materials. In the real world, materials aren’t just a single sheet. There’s layer upon layer. That’s what we have in the engine now. The demo was always called Opal. We do these dailies where we all sit down and review our content for these demos. I wanted to come up with a really cool idea to show off what we call Substrate, the material system. We just have a different name now; we changed the name. Nobody came up with a good idea.

I was driving home and thinking, “Hold on. This demo is called Opal. Don’t opals have all this weird iridescent stuff?” I go online and do a quick Google search. Okay, we’ll do an opal surface. We’re in such a place now with the engine where not only does it inspire us, but our customers, we see what they’re doing. It’s almost like this perpetual wheel now of awesomeness.
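
To make the layered-material idea Libreri describes a little more concrete, here is a toy, engine-agnostic sketch of treating a surface as a stack of layers (base metal, paint, dirt) rather than a single sheet. This is not Epic’s Substrate code; every type, name and number below is invented purely for illustration.

```cpp
#include <cstdio>
#include <vector>

// Toy illustration of a layered material: each layer partially covers the
// one beneath it, so the final surface response is a top-down blend.
// NOT Unreal's Substrate API; all names here are hypothetical.
struct MaterialLayer {
    float albedo[3];  // simple RGB color for the sketch
    float roughness;  // 0 = mirror-like, 1 = fully diffuse
    float coverage;   // how much this layer hides what is below (0..1)
};

struct SurfaceSample {
    float albedo[3];
    float roughness;
};

// Composite layers from bottom (index 0) to top: each layer blends over the
// running result according to its coverage.
SurfaceSample CompositeLayers(const std::vector<MaterialLayer>& stack) {
    SurfaceSample out{{0.f, 0.f, 0.f}, 1.f};
    for (const MaterialLayer& layer : stack) {
        for (int c = 0; c < 3; ++c)
            out.albedo[c] = out.albedo[c] * (1.f - layer.coverage) +
                            layer.albedo[c] * layer.coverage;
        out.roughness = out.roughness * (1.f - layer.coverage) +
                        layer.roughness * layer.coverage;
    }
    return out;
}

int main() {
    // Bottom-up stack: bare metal, metallic paint, patchy dirt.
    std::vector<MaterialLayer> carBody = {
        {{0.60f, 0.60f, 0.65f}, 0.30f, 1.00f},  // base metal
        {{0.10f, 0.30f, 0.80f}, 0.15f, 0.95f},  // blue metallic paint
        {{0.35f, 0.28f, 0.20f}, 0.90f, 0.40f},  // dirt, partial coverage
    };
    SurfaceSample s = CompositeLayers(carBody);
    std::printf("albedo=(%.2f, %.2f, %.2f) roughness=%.2f\n",
                s.albedo[0], s.albedo[1], s.albedo[2], s.roughness);
    return 0;
}
```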

Penwarden: When we first saw the Rivian coming over the rocks, where the suspension was simulating, we had to go for a pan under. That was really cool. But then someone said that when you go off-roading, you deflate your tires a bit. You’d see the tires compress and deform. So can we do that?

GamesBeat: The single ecosystem idea is interesting. Anna Sweet was articulating it in another way, where you want to build these awesome worlds and have all these different doors into them. Those doors can be films or TV or games or comic books or whatever else. All those entry points are what we’ve come to call transmedia. It feels like that’s one way of understanding it, but maybe also a way of understanding how the metaverse would get built. The metaverse application is an awesome world. Then you have all these different ways to interact with it.

Libreri: Whatever the metaverse ends up being, nobody really totally knows. Our feeling is that it’s about entertainment and bringing people together to experience entertainment in new ways. Because creators of all types use our engine, that alone will catalyze this. The fact that all these TV shows now are based on games, it’s really beginning to happen. The metaverse will get defined by these experiments. You can make something once and then take the content and repurpose it and try to do all sorts of different experiences.

You think about Fortnite. Fortnite is not really transmedia, but it started off as a completely different game and different experience. We took some of the components to make the game what it is today. We’re doing that every day with Fortnite Creative and now with UEFN. Now, because we have these awesome storytellers in the ecosystem–Jon Favreau has a company that makes VR games. He’s looking at what’s next in interactive experiences. What’s the crossover between physical spaces like theme parks and games? How does your online presence–how do you bring more meaning to this social world that you’re connected to in the games that you play?

It’s this catalyzing melting pot of ideas where we’ll see new stuff emerge. I don’t know what it’s going to be yet. I’ve got some ideas for stuff that I would like to do. Traditional entertainment, a TV show like The Masked Singer, imagine what that would be like when you have 20 or 30 million people in an experience and you have some agency as a player with the people performing? What’s live singing going to be like, or talk shows? There are so many things we can do. We’ve had this interesting phase with Twitch evolving and people watching streamers on YouTube and stuff. But eventually, you’ll be able to live broadcast inside the world of the game with millions of people participating. Crazy stuff is going to happen.

Unreal Engine 5.2 can capture amazing shadows and lighting.

GamesBeat: How do you get to this next level that everyone wants to get to with the metaverse? One thing I mean is, you’re making tools that make doing all of this stuff so much easier. You’re broadening the number of people who can create. At the same time, though, if their ambitions keep growing–we’re seeing games that take 10 years to make.

Libreri: One of the things that the world hasn’t totally explored yet, as we know, is that in a modern multiplayer video game at a huge scale, amazing things can happen. Now we have this new creator system with UEFN. The barriers that you have with a normal traditional development team aren’t quite there. You can even do some of the development live within a Fortnite ecosystem running the game. We’re going to see teams configured slightly differently from the way they would traditionally do it. Over time, not instantly, I think you’ll start to see community-created content. Not just individual creators, but people contributing, just like in real society. People will plant a tree in a park or whatever. We’ll see a new social way of making content.

Because it’s a virtual universe, metaverse, whatever you want to call it, we’re not limited by the same rules as this planet. The same way as when we’re making a game–we branch code. We’ll branch and do this other thing over here. If it’s good we’ll bring it back. If it’s not good we don’t. The metaverse can branch. Experiences can start with one thing, and another version of the same thing with slight tweaks can be available. As we start to have these super distributed teams, and the idea that if somebody does want to put their content out there for editing in the future, where other people can add to it and expand on it, we’ll start to see some interesting stuff.

Even storytelling is going to be like that as well. We talked about this years ago. I would love for a filmmaker–it’s one of the reasons that Sequencer is in UEFN from day one. I would love to see a filmmaker tell a story and then say, “There’s all our assets and content. Recut it. Change the protagonist. Put your protagonist in there.” It’s just the beginning. The next 10 years are going to be pretty awesome.

GamesBeat: The notion that each one of these worlds–Todd Howard was talking about Starfield having a thousand worlds in it, each of them very detailed. I’d like to see those things happen where the whole world or planet you visit seems like a really big planet. Then you go to the next one and the next one. Having an experience where you can always visit the next thing on your list.

Libreri: It’s funny. As you know, filmmakers use our engine for their LED walls. I don’t know if you’ve been to one of those sets, but when you go and see everybody collaborating, they’re moving stuff around like they’re playing a video game. The actors get so excited that they get carried away. They’re literally playing. “Well, we have to get some shots now. We have to stop playing with the environment.” We’ve started to see people making drivable vehicles in these things. MetaHumans are showing up on the LED walls now. It’s the beginning of the Star Trek holodeck. When you see them go into it you’re thinking it’s a real space, and then later it’s just a normal episode of Star Trek.

GamesBeat: I wonder about how some of these creators are scoping their ambitions. The one that seemed a bit ambitious was when Brendan Greene said that he wanted to make a whole Earth-sized world, and then let people go into it and play. They could set up a town where they have a shooter or some other kind of game. It didn’t feel like there would be enough people in the world to fill a world like that.

Libreri: I don’t think there is, and I think on top of that, something that’s spatially like the real world, a whole planet Earth, it’s such a huge volume to populate. Technology is not quite there yet to do that. And even if we did, I see this virtual–this is what Tim was alluding to in his speech at the end. Just like the web, you have hyperlinks. You go from one place to the next. You have two things from different places all at the same time. That’s more the way the virtual universe is going to evolve.

GamesBeat: One other guy I was talking to the other day said that those worlds are going to be like websites.

Libreri: Yes, it’ll be much more like the web. You may make your own sub-world. I go to this location a lot, so I’ll put that here. Across the road, I’ll put this here. People are putting too much emphasis on trying to map our terrestrial experience of being a human being on this big rock floating in space–mapping that onto the virtual universe. The virtual universe is going to be everything, every world, all at once. People selling real estate–real estate in a virtual world is meaningless. There’s an infinite supply of it. I don’t really want to travel infinitely across it.

Inworld AI is bringing better AI to non-player characters.

GamesBeat: I saw a demo that was very interesting where I was talking to an NPC, having a conversation. It had the latest AI. The question I asked was, “Is there anything else you want to tell me?” Then I got the whole answer from the NPC about why they were there and why I was talking to them. If I hadn’t asked that question, I wouldn’t have gotten the whole point of why I was talking to them. I thought that was fun. Now I can be an investigator, a detective who has to ask the right questions to get the right answers. That part felt like–if game designers can design that kind of experience–some people just want to ask two questions and be on their way. Whereas maybe there’s a shopkeeper or something that I’ll visit 40 times. I don’t want to hear the same conversation every time.

Libreri: In traditional RPGs I find it difficult when it’s just dialogue trees and you go through every question. I’d rather have it be a bit more organic and discoverable and unscripted.

GamesBeat: In that case, in smaller worlds it’s fine. You have smarter entities within each world.

Penwarden: Imagine playing an RPG where you actually had to walk from city to city. You’d waste so much time getting there. You don’t want to do that.

GamesBeat: What things do you see that are in motion that make you think you’re looking at something new, something that you didn’t expect so soon? What do you see that feels metaverse-like?

Libreri: Outside of the metaverse side of things, one thing that–I’ve worked in the world of computer graphics now for 30-odd years. We’ve gone down a path where we’re trying to completely simulate the real world. One thing that does excite me is that, finally, we’ve started to think about how to solve problems in a totally different way. It’s difficult for some engineers. Some engineers really want to know everything about every pixel and every bit of AI, everything that’s happening. That’s the nature of an engineer, to understand it all and build a system that can take advantage of understanding everything.

With all this new deep learning technology that is becoming available to us–I’m not really on about the ChatGPT stuff or the generative AI stuff. It’s more about the ability to have a machine be pretty good at filling in the gaps and making something look and feel like the intuition of a human being.

I’ll give you an example. Look at a painting. Right now, if we try to render a jungle in the engine from a distance so that it’s perfect and absolutely matches exactly what’s happening, that is not practical in real time. Not today. The way we would have done this with traditional computer graphics, in a movie or a game, is by making every leaf. Every leaf has light transmission, focus, and drops of rain on it. You can’t possibly do that for a billion leaves in the Amazon rainforest. But I know many painters to whom I can say, “Paint me something that looks like a distant jungle.” Do they know that all these specific things are happening? No, they just have an intuition for it.

Epic Games’ The State of Unreal at GDC 2023.

What all this deep learning allows us to do is have machines have some level of intuition so they can fill in the gaps. I’m excited about what that means not just for computer graphics rendering, but all sorts of things. We can give richer experiences and more believable results without the brute-force way of doing it. You think about AI for player characters. Right now it’s a state machine where if you have a brilliant programmer that can predict every possible option and move–it’s not an efficient process. Now we have all sorts of new horizons. It means that our design teams and engineering teams think of things very differently from the way they would have even just five years ago.
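
As a rough illustration of the “intuition” Libreri is describing: instead of evaluating every leaf, a learned model could map a few coarse inputs straight to an aggregate result. The sketch below is purely conceptual, in plain C++; the “learned” function is a hand-written stand-in, not a real engine or machine learning API.

```cpp
#include <cstdio>
#include <vector>

// Conceptual contrast between brute-force rendering of distant foliage and a
// learned approximation that "fills in the gaps". Nothing here is a real
// engine or ML API; it only illustrates the shape of the trade-off.

struct Color { float r, g, b; };

// Brute force: shade every leaf individually (impractical for ~1e9 leaves).
Color ShadeJungleBruteForce(const std::vector<Color>& leaves) {
    Color sum{0.f, 0.f, 0.f};
    for (const Color& leaf : leaves) {  // one evaluation per leaf
        sum.r += leaf.r; sum.g += leaf.g; sum.b += leaf.b;
    }
    float n = static_cast<float>(leaves.size());
    return {sum.r / n, sum.g / n, sum.b / n};
}

// Learned stand-in: a tiny fixed function pretending to be a trained model
// that maps a few coarse inputs (distance, density, sun angle) straight to
// an aggregate color, skipping the per-leaf work entirely.
Color ShadeJungleLearned(float distance, float density, float sunAngle) {
    float green = 0.55f * density - 0.02f * distance + 0.10f * sunAngle;
    return {0.10f, green < 0.f ? 0.f : green, 0.08f};
}

int main() {
    std::vector<Color> leaves(1000, Color{0.1f, 0.5f, 0.1f});
    Color a = ShadeJungleBruteForce(leaves);
    Color b = ShadeJungleLearned(/*distance=*/5.f, /*density=*/0.9f, /*sunAngle=*/0.7f);
    std::printf("brute force: %.2f %.2f %.2f | learned: %.2f %.2f %.2f\n",
                a.r, a.g, a.b, b.r, b.g, b.b);
    return 0;
}
```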

GamesBeat: For Senua’s face, from 2016 to now, what was different to you about the face? It looked pretty good in 2016.

Mastilovic: Maybe a little bit earlier than that, what was fundamentally different was that companies would usually want a few hero characters in which they invest millions, and then a population that doesn’t have to look as great. What fundamentally changed was that now we want all of them to be equal in quality to the hero characters. That was posing a huge problem. How do you multiply that big budget across hundreds of characters in the streets?

This is where the metahuman product was born. That’s basically the answer to that requirement. At the time we were working with Cloud Imperium Games. We talked here about populating an Earth-sized planet. They’re actually intending to populate a universe of parametric planets. They made some tech that looks pretty decent, but the problem is that it’s kind of boring. There’s too much space. What’s different is that we have now accomplished this equal quality across the board. Of course, the heroine looked better than the rest of the metahumans, but that gap is getting narrower.

We’ve obviously succeeded in democratizing the creation of assets and animations. What we’re also thinking about is how we can now instance unique characters in the world. When UEFN takes off and people start making these experiences with 100 high-quality characters, we don’t really want people to download all of these assets themselves. We need to come up with some sort of generative way of sharing only the descriptions. That’s obviously not solved yet. That’s part of our future. But we see this metahuman DNA format, which we mentioned in the presentation, as a key element of that. DNA is actually quite a small description of what a metahuman is. Everything else gets unpacked from parametric models that can synthesize the rest. That’s on our side. The thing that Kim mentioned about NPCs being boring, we’re also quite excited about autonomous characters. That’s part of our future as well.
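
To picture the “small description, unpack the rest” approach Mastilovic outlines: a character could be shared as a handful of parameters, with the full mesh synthesized locally from a parametric model everyone already has. The sketch below is a hypothetical illustration, not the actual MetaHuman DNA format; all structures, sizes and math are invented.

```cpp
#include <array>
#include <cstdio>
#include <vector>

// Toy sketch of the "compact description + parametric model" idea behind
// sharing characters without shipping full assets. None of this is the real
// MetaHuman DNA format; types, sizes, and math are invented for illustration.

constexpr int kNumParams = 8;       // e.g. jaw width, nose length, ...
constexpr int kNumVertices = 1000;  // tiny stand-in for a real face mesh

struct Vec3 { float x, y, z; };

// The "DNA": just a handful of floats describing one character.
using CharacterDescription = std::array<float, kNumParams>;

// The shared parametric model everyone already has locally: a base mesh plus
// one displacement direction per parameter (like blend shapes).
struct ParametricModel {
    std::vector<Vec3> baseMesh = std::vector<Vec3>(kNumVertices, {0.f, 0.f, 0.f});
    std::array<std::vector<Vec3>, kNumParams> deltas{};

    ParametricModel() {
        for (auto& d : deltas) d.assign(kNumVertices, {0.001f, 0.002f, 0.0f});
    }

    // Unpack a full mesh from the compact description.
    std::vector<Vec3> Synthesize(const CharacterDescription& dna) const {
        std::vector<Vec3> mesh = baseMesh;
        for (int p = 0; p < kNumParams; ++p)
            for (int v = 0; v < kNumVertices; ++v) {
                mesh[v].x += dna[p] * deltas[p][v].x;
                mesh[v].y += dna[p] * deltas[p][v].y;
                mesh[v].z += dna[p] * deltas[p][v].z;
            }
        return mesh;
    }
};

int main() {
    ParametricModel model;                        // shipped once, shared by all
    CharacterDescription npc{0.2f, -0.7f, 0.5f};  // a few bytes per character
    std::vector<Vec3> mesh = model.Synthesize(npc);
    std::printf("description: %zu floats -> mesh: %zu vertices\n",
                npc.size(), mesh.size());
    return 0;
}
```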

Libreri: On my side of things, first of all, we have much better facial technology that now moves the way Melina’s face moves on Senua. We’ve shot her and scanned her, hours of performances now. Our systems know exactly how her face can move. On top of that, because we have Lumen at this point, when we light, instead of being very artisanal–the original demo we did on the stage, with the beach and stuff, was so handcrafted. Not in terms of how the rocks looked. There were some photogrammetry rocks, but low quality. We couldn’t use Nanite. To do that now – even the simple demo that we did, which was only done in the last few weeks – is so much easier from a crafts perspective. It comes down to the imagination of the artist.

Senua from the upcoming Senua’s Saga: Hellblade II.

Lighting at this point is more about what story we want to tell with the lighting. When we wanted to get something like a flickering flame, it was so hard to do these things in the past. Game engine technology just couldn’t match the real world. Now it’s so much closer. We’re not there yet. We still have work to do on the eyeball rendering, how you get down to that micro-detail but still look good from a distance. But it’s so much easier. In fact, almost too easy. The artists designing that sequence at the end got carried away with an infinite amount of possibilities. Every day it would change. You’d never be able to do that in the past. You’d have an idea at the beginning and you’d need to execute that. But now you can do a change at the eleventh hour in a way that would have been impossible.

GamesBeat: The demo of the Rivian driving through the stream was cool because the mud washed off the car.

Libreri: It did wash off! You noticed that.

GamesBeat: That doesn’t seem easy.

Libreri: It was just one tech artist that did all the secondary effects for the car. The fluid system talks with Niagara. The Niagara system can talk to the texturing system. They can have all this information go between all these systems. We’re able to paint the side of the car with–the fluid system says it’s ejecting water particles. They’re Niagara particles. They collide with the body. What’s the texture of the body? Okay, it can paint into that. We can do a lot of cross-communication between systems to allow that level of immersion.

Penwarden: It’s about giving artists expressivity and programmability in all of these cases, so they can program custom effects and logic into Niagara for the fluid simulation and the materials. They can tie everything together.
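
A minimal sketch of the kind of cross-system plumbing Libreri and Penwarden describe: a particle system reports where water hits the car body, and each hit writes into a dirt-mask texture so the mud washes off at that spot. This is not Niagara or Unreal code; the names and data structures are invented, and only the data flow between systems is the point.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Minimal sketch of cross-system communication: a particle (water) system
// reports collisions against a surface, and each hit erases some "dirt" from
// a mask texture at the hit's UV location. Not Niagara / Unreal code; all
// names and structures are invented to illustrate the data flow.

struct Hit { float u, v; float strength; };  // where a droplet struck, in UV space

class DirtMask {
public:
    DirtMask(int w, int h) : width_(w), height_(h), texels_(w * h, 1.0f) {}

    // "Paint" into the mask: reduce dirt where water hit the body.
    void ApplyHit(const Hit& hit) {
        int x = std::clamp(static_cast<int>(hit.u * width_), 0, width_ - 1);
        int y = std::clamp(static_cast<int>(hit.v * height_), 0, height_ - 1);
        float& d = texels_[y * width_ + x];
        d = std::max(0.0f, d - hit.strength);
    }

    float AverageDirt() const {
        float sum = 0.f;
        for (float d : texels_) sum += d;
        return sum / texels_.size();
    }

private:
    int width_, height_;
    std::vector<float> texels_;  // 1.0 = fully dirty, 0.0 = clean
};

// Stand-in for the fluid/particle system: it only needs to hand the material
// system a list of collisions this frame.
std::vector<Hit> SimulateSplashesThisFrame() {
    return { {0.25f, 0.40f, 0.3f}, {0.26f, 0.41f, 0.3f}, {0.70f, 0.55f, 0.2f} };
}

int main() {
    DirtMask carSideMask(64, 64);
    for (int frame = 0; frame < 120; ++frame)            // drive through the stream
        for (const Hit& h : SimulateSplashesThisFrame()) // particles -> texture
            carSideMask.ApplyHit(h);
    std::printf("average dirt after the stream: %.3f\n", carSideMask.AverageDirt());
    return 0;
}
```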



Author: Dean Takahashi
Source: VentureBeat
