Think 5-10-15 years when everyone has gigabit+ fiber to the home.
Latency is much harder to reduce than bandwidth is to increase.
At a certain point you're also limited by the speed of light: round-trip latency to halfway across the world cannot physically be less than ~133ms in vacuum, and ~200ms in fiber, where light travels at roughly 2/3 c (unless our knowledge of physics advances and the speed of light is no longer a limit).
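The back-of-the-envelope arithmetic, using assumed round figures for Earth's circumference and fiber's refractive index:

```python
# Minimum round-trip time for a signal to the far side of the Earth
# and back. All figures are rough assumptions, not measurements.
C_VACUUM = 299_792_458           # speed of light in vacuum, m/s
FIBER_INDEX = 1.47               # typical refractive index of optical fiber
HALF_CIRCUMFERENCE = 20_037_000  # metres, half of Earth's ~40,074 km equator

round_trip_m = 2 * HALF_CIRCUMFERENCE
rtt_vacuum = round_trip_m / C_VACUUM                 # ~0.134 s
rtt_fiber = round_trip_m / (C_VACUUM / FIBER_INDEX)  # ~0.197 s

print(f"vacuum: {rtt_vacuum * 1000:.0f} ms, fiber: {rtt_fiber * 1000:.0f} ms")
```

And that's the physical floor before adding any routing, queuing, or processing delay.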
It probably works best if the game engine cooperates. But that's not necessary. You can just split processes on the OS and run each different bit of user input in a different process with no cooperation from the process. (Though I admit this might be tricky on current hardware and heavy games.) Given enough compute and bandwidth, you could do this continually.
In theory, with unlimited compute/bandwidth this means you can have local latency (just the cost of input/stream switching), because you could speculatively execute every possible input to the game, all the time, out to the latency duration. In practice, it'll probably prune things based on the likely inputs and only speculate a bit out. This is probably enough to provide a smooth experience for most users who aren't playing competitively.
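The idea above can be sketched in a few lines. This is a toy, with a made-up discrete input set and a trivial stand-in game state; the point is only the shape of the technique: advance a copy of the state for every input the client might send, and when the real input arrives one RTT later, the matching frame is already computed.

```python
POSSIBLE_INPUTS = ["left", "right", "jump", "noop"]  # assumed discrete set

def step(state, user_input):
    """Advance a trivial stand-in game state by one tick."""
    state = dict(state)  # copy, so speculative branches don't share state
    state["x"] += {"left": -1, "right": 1}.get(user_input, 0)
    state["tick"] += 1
    return state

def speculate(state, horizon):
    """Enumerate the game state after every input sequence up to `horizon` ticks."""
    frontier = {(): state}
    for _ in range(horizon):
        frontier = {
            seq + (inp,): step(s, inp)
            for seq, s in frontier.items()
            for inp in POSSIBLE_INPUTS
        }
    return frontier

futures = speculate({"x": 0, "tick": 0}, horizon=2)
print(len(futures))  # 4 inputs over 2 ticks -> 16 speculative states
# When the client's real input sequence arrives, switch to the matching branch:
chosen = futures[("right", "jump")]
```

A real system would render and stream a frame per branch and prune unlikely branches rather than enumerating everything, but the switch-to-the-matching-branch step is the same.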
If you think about a game as a mapping from a limited set of user inputs to a 2D image, some optimizations start coming out, I suppose.
But that sounds almost impossibly computationally expensive for 3D games and the like. Furthermore, most game inputs aren't discrete but continuous, making the problem even harder.
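The cost objection is easy to quantify: with b possible inputs per tick and a horizon of n ticks, naive speculation must simulate b^n futures. A rough illustration (tick rate and quantisation figures are assumptions for the example):

```python
def speculative_states(inputs_per_tick: int, horizon_ticks: int) -> int:
    """Number of futures a naive speculator must simulate: b ** n."""
    return inputs_per_tick ** horizon_ticks

# Even a tiny discrete input set grows fast over a ~100 ms horizon at 60 Hz:
print(speculative_states(4, 6))    # 4 inputs, 6 ticks -> 4096 states
# A continuous input (mouse aim) quantised to just 256 directions is hopeless:
print(speculative_states(256, 6))  # 2**48, ~2.8e14 states
```

Hence the pruning mentioned above: without cutting the tree down to the few likely branches, the exponent wins.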
Do you have a link to the paper?
So the next big breakthrough in data transmission will be neutrino rays....
The latency of the human mind is around the same [1]. There are lots of tricks, like predicting the future game state, that can result in a better user experience.
Actual perception times are much lower than that, about 13ms [2]. You can see the difference for yourself by looking at a 30FPS (33ms) and 60FPS (16ms) video [3], and the effect is much greater when you're actually providing the inputs.
[1]: https://stackoverflow.com/questions/536300/what-is-the-short...
[2]: https://newsoffice.mit.edu/2014/in-the-blink-of-an-eye-0116
[3]: http://www.30vs60fps.com/
Yes, but people have been playing online games for a while now and latency was always there. I don't see why it would suddenly become a problem.
They'll just choose the closest server (i.e. the one that is not across the globe), as they always did.
Whether the service is "rendering", "web", or anything else makes no difference with respect to latency.
If the rendering is distant, the time until you see that bullet fire becomes 100ms instead of 50ms.