
How Google Stadia’s ‘negative latency’ might work

Google Stadia is going to feel more responsive than local hardware … eventually. At least, that's the claim Stadia engineering boss Madj Bakar makes in an interview in the latest issue of Edge magazine (via PCGamesN). That doesn't seem possible, right? Latency is inherent to sending video over the internet, and you cannot break the laws of physics. Except the laws of physics never counted on what Google is calling "negative latency."

Stadia hasn’t launched yet, so gaming fans are dissecting the company’s every statement to figure out what to expect. And something like “negative latency” is ripe for that kind of speculation (and ridicule). But what is it — and can it really make Stadia more responsive than local hardware?

Maybe, but it’s going to require a lot of work on Google’s part.

Negative latency is a suite of techniques that Google will use to mitigate the lag between your screen and Stadia’s servers. The idea is that Stadia’s network of super-powerful gaming GPUs and CPUs will often have enough spare power for some clever tricks.

Extreme framerates

One of the examples of negative latency in the Edge story is running games at an extremely high framerate. This is a well-known technique for reducing input lag.

Counter-Strike: Global Offensive players often try to run that shooter at 400 frames per second or higher even on a 60Hz display. That's because even if the monitor cannot display most of those frames, whenever it does draw the next frame, it pulls from the most recently rendered one, which reflects the most recent possible input. This can shave off a significant amount of perceivable input latency.
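To put rough numbers on that (these are my own illustrative figures, not from Edge or Google), here's the arithmetic in Python:

```python
# Illustrative arithmetic, not Google's numbers: rendering faster than
# the display refreshes shrinks how stale the input baked into each
# displayed frame can be.

def worst_case_input_age_ms(render_fps: float) -> float:
    """At scan-out time, the newest finished frame is at most one
    render interval old, and so is the input it sampled."""
    return 1000.0 / render_fps

for fps in (60, 144, 400):
    print(f"{fps:>3} fps render -> input up to "
          f"{worst_case_input_age_ms(fps):.1f} ms stale at display time")
# 60 fps  -> ~16.7 ms; 400 fps -> ~2.5 ms, even on a 60Hz monitor
```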

That effect should work exactly the same on Stadia.

Predictive inputs

Superfast framerates aren't what caught most people's attention, however. Instead, Stadia skeptics are worried about the service predicting user inputs.

Now, I've reached out to Google for clarification about this, but it hasn't responded. As far as I know, the company hasn't explained exactly how this will work. So I don't blame people for thinking that Google is going to play the game for you. That's what I thought when I first read it. But that's probably not what's going to happen.

Again, Stadia theoretically has enough power to render multiple instances of the same game for every player. With that in mind, a predictive-input technique could use machine learning to figure out what a player is likely to do at any given moment. Stadia could render the top three of those likely outcomes so that they're ready to return to the player the second the actual input reaches the servers.
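Here's a toy sketch of what that could look like (the function names are hypothetical, since Google hasn't detailed its approach):

```python
# A toy sketch of speculative input rendering. Everything here is my
# own illustration; Google hasn't said how (or whether) it works this way.

from typing import Callable

def serve_frame(game_state,
                predict_top_inputs: Callable,  # e.g., a learned input model
                render: Callable,              # (state, input) -> frame
                actual_input):
    # While the player's real input is still in flight, render a frame
    # for each of the most probable next inputs.
    speculative = {inp: render(game_state, inp)
                   for inp in predict_top_inputs(game_state, n=3)}

    # When the real input arrives, return a pre-rendered frame if the
    # prediction hit; otherwise render normally (no worse than today).
    if actual_input in speculative:
        return speculative[actual_input]
    return render(game_state, actual_input)
```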

Time-traveling

But Google might not have to rely on prediction. It could just send your inputs back in time. This is something the emulator front end RetroArch has implemented in a feature called "runahead."

Here’s a good explanation of how it works from the blog Filthy Pants:

“The way it works is whenever the player’s input changes, you roll back one frame and apply the new inputs retroactively and then emulate two frames to catch back up. This makes your inputs go into effect one frame before you actually pressed the button.”

Runahead is the real deal. It makes emulation less laggy than real-world hardware. Super Mario Bros. on the NES, for example, has two frames of lag between pressing jump and Mario leaving the ground. RetroArch can reduce that to one frame.
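A minimal sketch of that loop might look like this (the emulator object and its save_state/load_state/step hooks are hypothetical stand-ins; RetroArch's real implementation differs in the details):

```python
# A minimal sketch of runahead with a rollback window of one frame,
# following the Filthy Pants description quoted above.

class RunaheadRunner:
    def __init__(self, emu):
        self.emu = emu
        self.prev_snapshot = None
        self.prev_input = None

    def run_frame(self, new_input):
        if self.prev_snapshot is not None and new_input != self.prev_input:
            # Input changed: roll back one frame, apply the new input
            # retroactively, then emulate forward to catch back up.
            self.emu.load_state(self.prev_snapshot)
            self.emu.step(new_input)  # redo last frame with the new input
        # Remember "one frame ago" so the next call can rewind to it.
        self.prev_snapshot = self.emu.save_state()
        self.prev_input = new_input
        self.emu.step(new_input)      # advance to the present frame
```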

This is another example of how Google could use the power of Stadia to reduce perceivable latency. Modern games are too complex to roll back and re-simulate the way Filthy Pants describes, but maybe Stadia's endless computational cloud could handle it.

Negative latency can only fix what Google can control

I think it's a wild claim to suggest that Stadia is going to have less latency than a console in a year or two. But then again, I'm not trying to sell Stadia. Still, what Bakar is suggesting isn't impossible.

All gaming setups have lag. Controllers have to send signals over the air. Consoles have to render frames and then send them to a TV that might have terrible, laggy effects. The entire process could take more than 100ms.
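Here's a back-of-the-envelope budget for a typical living-room setup, with figures I've assumed for illustration (not measurements from the article):

```python
# Rough local-play latency budget. All figures are my own illustrative
# assumptions; real numbers vary a lot by controller, game, and TV.
local_pipeline_ms = {
    "wireless controller poll":    8,   # ~125Hz polling
    "game simulation (30 fps)":   33,   # one frame of game logic
    "render + display buffering": 33,   # another buffered frame
    "TV processing":              40,   # much worse outside game mode
}
print(f"total: ~{sum(local_pipeline_ms.values())} ms")  # clears 100 ms easily
```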

For Google, negative latency isn’t about transferring inputs and frames to/from the server faster than the speed of light. It’s about mitigating all of the other sources of latency. So will that end up feeling better than local hardware? Absolutely — if your definition of local hardware is a console playing games on a TV at 30 frames per second. And it’s likely that’s exactly what Google means.


Author: Jeff Grubb
Source: Venturebeat
