Epic Games’ Kim Libreri on how real the metaverse can look

If there is a small circle of people to ask just how realistic the metaverse can look, Kim Libreri is one of them.

Libreri has had a long career at the intersection of movies, special effects, games, technology, and science fiction. He helped create the famous “bullet time” scene in the original The Matrix film. He then made the shift from linear film to visual technology for games, and he is now the chief technology officer of Epic Games, where he has made memorable demos such as Hellblade, Siren, and A Boy and His Kite. He also happens to have a cameo role in The Matrix Resurrections, the film that came out in December with original stars Keanu Reeves and Carrie-Anne Moss.

I spoke with Libreri in a fireside chat at our GamesBeat Summit: Into the Metaverse 2 online event. I asked him what he thinks of the convergence of games, film, and the technology used to make them. Epic Games showed off that convergence with its demo for the PlayStation 5 and Xbox Series X, dubbed The Matrix Awakens.

“It’s pretty awesome to see how things are coming together. Even when we made the original The Matrix movies 20-odd years ago, it really was our desire to make the imagery for the movies in a way inside the computer. So to actually see that it’s possible in real time is a kind of fulfillment of what we were all sort of dreaming of 20 years ago.”

The Matrix Awakens demo debuted at The Game Awards in December, and it was meant to show off the technology made possible with Unreal Engine 5, which debuts in 2022. Epic created a faux city for the demo over the past year, and it will give away that city as part of the engine so game developers can use it as a foundation for their own metaverse games.

Epic knew a new movie was coming, and Libreri was friends with both film director Lana Wachowski and special effects wizard John Gaeta. Epic wanted to make a demo to get people excited about the potential of the future of gaming and visual technology, and Wachowski and Gaeta were happy to collaborate on it.

“What better way to hint at the future of gaming, and the sort of things that the metaverse is going to be,” Libreri said.

Crossing the uncanny valley?

That demo looked pretty real. Asked how real we can make the metaverse seem, Libreri said, “When you look at a demo, it looks pretty real. I don’t think we’re perfect. But the environment, the city, the cars — we’re getting close. If you squint, it’s getting to the point where you really can’t tell the difference between that and the real world.”

He added, “From a purely visual perspective, I think that we’re probably one more console generation away — maybe two. I actually feel pretty confident that within the next five or six years, on PC, and maybe a little before the new consoles come out, we’ll see hardware that is powerful enough to really blur that line completely.”

That is on the pure visual side.

“With the pixel rendering, they’re taking triangles and turning them into something that you believe is real. And that was part of what we were trying to get across with the demo,” Libreri said. “But you know, there are some complex aspects. The real world is infinitely complex, and has human beings and nature, and all these random chaos-generating things. So I think we’ve got a fair way to go in terms of believing that a synthetic character is real from a brain perspective.”

He added, “As long as I’ve got the awesome Keanu Reeves and Carrie-Anne Moss, I can digitize them in a way that I can bring into our engine, or onto a PlayStation or Xbox or PC, and produce quite convincing imagery. But to make them act and react the way human beings would, we’re still a fair way off. Solving the Turing test (where an AI passes a Q&A meant to reveal it is not human) — we’ve got quite a way to go in computer science for that.”

I noted that they have at least gotten past, or pretty close to crossing, the “uncanny valley,” the long-standing observation that the closer an animation gets to reproducing a human, the weirder it feels.

Crossing the uncanny valley is cool, he said.

“But once you can control a world and a universe that is as complex as the natural world, then you can start to do crazy things that you couldn’t possibly do in the real world,” he said. “So I look forward to all the creative and gaming opportunities and, you know, just the kind of things that you can do as a group of friends coming together in these virtual universes.”

Easy or hard to do?

Epic Games is giving away a free city with Unreal Engine 5.

As I noted in my opening speech, I keep vacillating on how hard the metaverse is going to be to build and, when you see the automation of city-building in The Matrix Awakens, how easy it’s going to be to build these things, especially since Unreal Engine 5 is coming out this year. I asked Libreri about that.

“The assets that go into building the city — we’re going to give away,” Libreri said. “They took a huge amount of effort to make. And they’re made in a very modular way. When we started the demo, we were very cognizant that we didn’t want to have a huge, multi-hundred person team building the city. So a lot of the city is laid out from these little building blocks, almost like Lego building blocks for the corners of the buildings, the window frames, the rooftops, the air conditioning units, and then we use procedural rules to populate the world.”

He added, “What we try to do is define the rules that would lay out a city, or at least make an effort to lay out a city, that looks like the real world. And then the components are the things that we’re going to give away, including the whole project file that actually runs the entire city. On top of that, you know, we have things like the Quixel Megascans library, which we’ve been adding and adding to, so customers and users of Unreal Engine don’t have to make rocks and trees.”
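To make the “Lego building blocks plus procedural rules” idea concrete, here is a minimal sketch of rule-based assembly. This is not Epic’s actual tooling (the kit pieces, rules, and function names are all invented for illustration), but it shows how a few modular assets and some simple placement rules can generate many distinct buildings:

```python
# Hypothetical sketch: assemble a building facade from modular kit pieces
# using simple procedural rules, in the spirit of the "Lego building blocks"
# approach Libreri describes. All names here are invented for illustration.
import random

# A tiny modular kit; a real project would reference mesh assets instead.
KIT = {
    "corner": ["corner_brick", "corner_concrete"],
    "window": ["window_plain", "window_arched", "window_ac_unit"],
    "roof": ["roof_flat", "roof_hvac", "roof_water_tower"],
}

def build_facade(floors: int, bays: int, seed: int) -> list[list[str]]:
    """Lay out one facade as a grid of kit pieces using placement rules."""
    rng = random.Random(seed)  # deterministic per-building variation
    facade = []
    for _ in range(floors):
        row = []
        for bay in range(bays):
            if bay in (0, bays - 1):
                row.append(rng.choice(KIT["corner"]))  # rule: edges get corner pieces
            else:
                row.append(rng.choice(KIT["window"]))  # rule: interior bays get windows
        facade.append(row)
    facade.append([rng.choice(KIT["roof"]) for _ in range(bays)])  # rule: top row is roofline
    return facade

if __name__ == "__main__":
    for row in build_facade(floors=3, bays=5, seed=42):
        print(row)
```

Because the seed drives every choice, one rule set can stamp out thousands of varied buildings from the same small kit, which is the economy Libreri describes: a modest team builds the blocks once, and the rules do the rest.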

Epic also provides things like fire hydrants and telephone booths, since it has digital versions of them in the Megascans library. On top of that, Epic is making it easier than in the past to give characters and vehicles sophisticated AI, so that the city has traffic and MetaHuman characters (those made with Epic’s tool for creating human characters).

Making the metaverse cool

Agent Smiths in a car chase in the Unreal Engine 5 demo.

In addition to these tools, Libreri said there are other things you need to build a metaverse, such as really robust multiplayer networking to connect players together. You also have to maintain social graphs: in Epic’s case, the Epic Online Services that maintain a friends list and enable communication between parties.

Epic’s Unreal is also going to link to Nvidia’s Omniverse simulation product, which connects with a variety of other 3D animation tools. So things made in the Omniverse have the potential to work in an Unreal game and vice versa.

“Not everybody is an amazing artist; not everybody can afford to make a tree, or a building, that looks real,” said Libreri. “We obviously have a great community of developers out there across the games industry that are beginning to make cool things that they load up onto Sketchfab, or other online services. Part of this is companies like ourselves making stuff. But I also think the community in the computer graphics world is doing a pretty good job of standardizing content, and of making content that allows people to get that stuff for free, add it to their project, maybe improve it, and then pass it on to the next group of people to do something even cooler with it.”

Enabling smaller developers

A real Keanu Reeves walks into a simulated scene.

Epic’s big push is to make the metaverse open. By making a lot of these tools available, some of them for free, it feels like smaller developers will have a better chance to contribute. Nvidia hopes to build a digital twin of the Earth in the Omniverse so it can predict climate change. And if it does so, it might make its digital twin available for free. I asked Libreri about enabling the indies.

“You saw the evolution of the VR communities over the last few years,” he said. “And that was a really open environment where people were sharing ideas. We’re all going into this uncharted territory of what it means to build a virtual universe that supports not just gameplay but all sorts of social events. I think that people are so excited about the potential that there is a lot of sharing happening. And the fact that people are willing to share assets they’ve made, or companies are willing to give away, for free, things that would normally be very difficult to make, just helps people create. And the art of creation is not just making something that looks good. You have to make something that feels good, and there’s so much stuff to work out.”

Libreri thinks we’ll see an open mindset across the community as we head toward building lots of virtual universes.

Machine learning

The Matrix Awakens demo

I asked Libreri how much help would come from machine learning and AI.

“I’ll give an example of something that’s still pretty complicated,” he said. “If you make a vehicle that’s drivable in the virtual world, there are a lot of dependencies on the logic: how you programmed it, what the buttons do, how the physics system works, how it interacts with different terrain, beyond just the hard-surface aspects of your rendering and shaders. And you know, USD is trying to standardize materials; you’re trying to standardize how you model and render something. But when something has to be smart and have logic built into it, that actually is complicated, because we’re just working out ways to make these things as realistic as possible right now. And I think that with all these companies trying to work together on some understanding of what the interchanges need to be for the metaverse, I do think that we’re going to eventually break through to have some standards for things that are much more complicated.”

But if you generate enough input and output data from running a sophisticated simulation, machine learning can do a pretty good job of recreating the behavior of a car or a character, Libreri said.

“A computer can actually do a pretty good job of emulating what’s going on under the hood,” he said.

It’s hard to process all the data for an AI-driven car or character physics, but once you train a deep learning algorithm to do it, it can eventually do a good job, he said.

“For the metaverse, it may be good enough to have an inferred set of logic as opposed to actually porting over all that sort of gameplay logic that you would have to for smart objects,” he said.

And if you wanted machine learning to convert your assets when you move from one world to another world, that could be possible and quite useful, Libreri said.
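As a deliberately toy illustration of that “inferred logic” idea, the sketch below runs a made-up, one-line vehicle simulation to generate input/output pairs, then fits a surrogate model that imitates it. A linear least-squares fit stands in for the deep learning Libreri alludes to, and the dynamics are invented for illustration:

```python
# Hypothetical sketch of "inferred logic": learn a surrogate that imitates a
# simulation from its input/output data, instead of porting the original code.
import numpy as np

def sim_step(speed: float, throttle: float) -> float:
    """Toy 'ground truth' gameplay logic: throttle accelerates, drag slows."""
    return speed + 4.0 * throttle - 0.1 * speed

# 1. Generate training data by running the real simulation.
rng = np.random.default_rng(0)
speeds = rng.uniform(0.0, 50.0, size=1000)
throttles = rng.uniform(0.0, 1.0, size=1000)
next_speeds = np.array([sim_step(s, t) for s, t in zip(speeds, throttles)])

# 2. Fit a surrogate: next_speed ~ a*speed + b*throttle + c.
X = np.column_stack([speeds, throttles, np.ones_like(speeds)])
coeffs, *_ = np.linalg.lstsq(X, next_speeds, rcond=None)

# 3. The surrogate now reproduces the behavior without the original logic.
def surrogate_step(speed: float, throttle: float) -> float:
    return float(coeffs @ np.array([speed, throttle, 1.0]))

print(sim_step(20.0, 0.5), surrogate_step(20.0, 0.5))  # outputs should match closely
```

A real gameplay system is nowhere near this linear, which is why Libreri points to deep learning and large amounts of simulation data, but the workflow is the same: run the authoritative simulation, record inputs and outputs, and train a model that can stand in for the logic in another world.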

The sniper and the metaverse problem

Let’s hope that sniper can’t see that far in Fortnite.

I asked about more challenges, and Libreri mentioned Fortnite and the problem of a sniper on top of a mountain who can see anybody on the map.

“You have a big, large-scale virtual universe where you’re allowing traditional shooting-type games within the environment,” he said. “Normally, the way that people would think about distributing a hugely parallel world is you divide it into a grid, and players would be in little areas of that grid and move from grid to grid to grid. But then if you’re on the top of a mountain, and you have an ultra-powered sniper rifle, and you look through it, then you can see somebody that is miles and miles away. So now you’re not only having to communicate simple network traffic between these grid locations; you have to deal with the rendering coming from a completely different machine, or you’re having to transport everything that’s happening within the local view that you see through a sniper rifle.”
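Here is a minimal sketch of that grid-based scheme and why the sniper breaks it. This is not Epic’s netcode (the cell size, ranges, and function names are invented for illustration), but it shows how a scoped player’s interest set suddenly has to include cells far outside the usual neighborhood:

```python
# Hypothetical sketch of grid-based interest management. Players normally
# receive updates only from nearby cells; a sniper's sight line pulls in
# distant cells along the view direction. All numbers are invented.
import math
from collections import defaultdict

CELL = 100.0  # cell size in meters

def cell_of(x: float, y: float) -> tuple[int, int]:
    return (int(x // CELL), int(y // CELL))

def interest_cells(x, y, scoped=False, yaw=0.0, scope_range=2000.0):
    """Return the set of grid cells whose state this player must receive."""
    cx, cy = cell_of(x, y)
    cells = {(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}  # local 3x3
    if scoped:
        # March along the sight line and add every cell it crosses.
        step = CELL / 2
        for i in range(int(scope_range // step)):
            d = i * step
            cells.add(cell_of(x + d * math.cos(yaw), y + d * math.sin(yaw)))
    return cells

# Server-side buckets: players and objects grouped by cell.
world = defaultdict(list)
world[cell_of(50.0, 50.0)].append("player_near_origin")

local = interest_cells(1000.0, 1000.0)  # just the 3x3 around cell (10, 10)
scoped = interest_cells(1000.0, 1000.0, scoped=True, yaw=math.radians(225))
print(len(local), len(scoped))        # the scoped set is much larger
print(cell_of(50.0, 50.0) in scoped)  # True: the distant player's cell is now relevant
```

In the unscoped case, the number of cells a player touches is small and predictable; a single scoped player can force the server (or several servers, if the grid is sharded across machines) to stream state from dozens of extra cells, which is exactly the cross-machine problem Libreri describes.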

The question is how you do this when there could be hundreds or thousands of players spread across the map, which would be much larger than today’s battle royale maps. Libreri said that Tim Sweeney, CEO of Epic Games, might argue that we need a new programming language for gameplay when it comes to massively parallel simulations. And we need really clever ways of distributing the computation and data management across the cloud infrastructure.

It would be exceedingly hard to create a battle royale game with 1,000 players, instead of just 100, because of all of the networking challenges. It’s also just a big creative problem, he said. Or how do you do a concert with 10 million people in it and truly make it feel like there are 10 million people in it?

Metaverse dreams

There are 7,000 simulated buildings in the Unreal Engine 5 demo.

I asked Libreri what kind of metaverse he dreams about. Some people say it is overhyped, and some say it is already here in the form of Second Life or Grand Theft Auto Online. I asked what some of the things are that we can do in virtual spaces that we can’t do in real life. What things exist in the real world that could be cooler in the metaverse?

He noted that multiplayer games have been around for decades where you come together and go to a virtual place. Games show the way forward.

“The cool thing for me is that it gives creative people so much more freedom to think differently about what a cool experience is,” he said. “It’s funny, because the evolution of computer graphics was initially ‘let’s work out how to match reality as best we can.’ And now you’re starting to see a few more trippy uses, where people start to think about what happens if physics doesn’t behave the way real physics works. What if we deal with changing scale in interesting ways, even changing the way that we look at the world? This concept of perspective, and the way we emulate perspective in a computer, is based on the way light rays in the real world travel into our eyes. But we can mess with that inside the computer, and we can start to really do some crazy cool stuff.”

He added, “I don’t know if you remember, in the original 3D Mario game on the Nintendo 64, there’s one mind-blowing moment where you walk down this corridor and you end up turned around, upside down. I couldn’t do that in the real world, but I can do that in a video game. And that’s what I’m looking forward to. I’m looking forward to the joining of the real world and the virtual world.”

He said it would be cool to take the concerts in Fortnite and stage them simultaneously in the game and in real life: the performers, dancers, and singers would be in the real world, but with motion capture they could appear in the virtual universe at the same time.

“I think there’s so many awesome things that we’re going to be able to do that the sky is the limit,” he said.

I asked Libreri what might happen this year, and what might happen in a decade. He brought up the Fox TV show The Masked Singer. It would be cool to have more audience participation, with gamification tossed in. You could also have quiz shows with huge numbers of people in the audience participating.

“We’re going to see some mind-bending stuff over the next year or two,” he said, though he thinks total interoperability between different metaverses will be very hard to solve.

He added, “If you try to be too cerebral, you’ll miss the opportunity of what’s actually cool. I think the harder problem is populating a world with AI humans that you believe are actual characters, assuming we even need to do that. The nice thing about a virtual universe, whether it’s the world of a game or a world of other types of experiences, is you have people at the other end, whether they’re in front of their keyboards and video cameras, on consoles, or out in the real world with an AR device. We have people, so I don’t know if we totally need to solve the Turing test right now with digital humans.”


Author: Dean Takahashi
Source: VentureBeat
