
Interactive Latency Compensation Strategies for Games: part 1
Posted By Tom Larkworthy, June 15, 2014

Dealing with lag in multiplayer games is important to get right; otherwise, players stutter or are drawn in completely the wrong place. I wanted to play with some different latency compensation schemes to get a feel for the problem myself.

The above demo is two separate applications communicating through a Firebase server located in the US. There is a real network delay between them, so you can get the two-player experience without two devices.

Technology

I have used Construct 2, the HTML5 game builder, to build an interactive prototype. I find this program brilliant for getting feedback quickly on interactive experiences. It exports cross-platform code that works out of the box on mobile and desktop. Even though I am a programmer, I still find I can work faster with Construct 2 for small projects like this.

I used Firebase for the networking. Again, it's very easy to set up and hides most of the vagaries of real-time networking. I wrote the Firebase Construct 2 plugin in 2013. Disclaimer: I have worked for Firebase since 2014.

Firebase

The essence of Firebase is that your data is one big JSON tree on their servers. You can write to specific locations in the data tree, and listen to locations in the tree. Any client that attaches a listener is notified in real-time when the data changes (as much as any internet technology is real-time).

Firebase instantly propagates local changes to local listeners. This simplifies network programming greatly, as you don't need to treat local modifications of data differently from remote modifications. Just write all data to the server tree, and drive your model directly off the server notifications.
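As a rough sketch of that pattern in the 2014-era JavaScript SDK (the app URL, paths and drawPlayer function here are hypothetical):

var root = new Firebase("https://my-game.firebaseio.com");
var playerRef = root.child("players/player1");

// All state changes are written to the server tree...
playerRef.set({x: 120, y: 64});

// ...and the game model is driven entirely off notifications.
// Firebase fires this callback immediately for local writes too,
// so local and remote changes take exactly the same code path.
playerRef.on("value", function (snapshot) {
  var player = snapshot.val();
  drawPlayer(player.x, player.y);
});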

One advanced feature exploited in this demo is that the server can replace a special timestamp token with the Firebase server's own time value.
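In the JavaScript SDK the token looks like this: the client writes a placeholder, and the server substitutes its own clock before storing and broadcasting the value (the events path is hypothetical):

// The client never sees the server clock directly; it writes a
// placeholder that the server replaces with its time in ms.
root.child("events").push({
  type: "fire",
  t: Firebase.ServerValue.TIMESTAMP
});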

Latency estimation

The time it takes for data to be transmitted between two computers is called the latency (or lag in gamer parlance). Latency introduces a delay between the server's view of the game and the client's. The server will see updates from other players before a particular client does, and a client will see local changes before the server.

We can estimate latency by pinging the server and measuring the time it takes for the message to come back.

latency = (ping_received_time - ping_sent_time) / 2

However, latency is not constant. It depends on lots of things outside our control, like internet congestion. So we need to keep pinging and measuring the result constantly.
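One way to ping with Firebase, which may differ from how the demo does it, is to time how long a write takes to be acknowledged by the server; the onComplete callback of set() fires once the write is committed:

// One ping: measure the round trip for a server-acknowledged write,
// then halve it for an estimate of the one-way latency.
function ping(callback) {
  var pingSentTime = Date.now();
  root.child("pings/player1").set(pingSentTime, function () {
    var pingReceivedTime = Date.now();
    callback((pingReceivedTime - pingSentTime) / 2);
  });
}

// Latency is not constant, so keep sampling.
setInterval(function () {
  ping(updateLatencyEstimate); // smoothing filter, defined below
}, 1000);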

Worse still, an individual latency measurement is subject to a lot of noise, so the results of calculations requiring accurate latency estimates will jump around with every measurement. To reduce the effect of noise, we can use an exponential smoothing filter. It's not the best filter, but it is absurdly simple to implement:-

latency_next = k*latency + (1-k)*latency_prev

k is the forgetting factor (between 0 and 1). The filter takes a weighted average of the new data and the previous estimate, biased by the forgetting factor. A k of 1 says to ignore the historical values and trust the latest data fully. Low k values trust new data less, which smooths random noise out of the signal. k is a setting you can play with in the demo.
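In code the whole filter is one line of state update (the variable names are my own):

var k = 0.1;              // forgetting factor, tweakable in the demo
var latencyEstimate = 0;  // latency_prev in the formula above

function updateLatencyEstimate(latency) {
  // Weighted average of the new sample and the running estimate.
  latencyEstimate = k * latency + (1 - k) * latencyEstimate;
}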

Clock Skew

One issue that pops up early is that clients' clocks are rarely telling the same time. Across my PC, Mac and Nexus 4 I have measured clock skews of 100ms, 500ms and 1600ms from the Firebase servers. You can't trust clocks, but they are a critical element of latency compensation.

Firebase servers provide an authoritative time value through the timestamp functionality. To estimate local clock skew we need to measure the difference between the client's local clock and Firebase's timestamp. However, we can only observe Firebase's timestamp via the network, so the measurement is delayed by latency and subject to temporal noise.

To estimate clock skew, every time a timestamped network packet is received, we update the clock offset, again using a filter:-

t_offset_next = k*(t - server.t) + (1-k)*t_offset_prev

I used the same value for the clock offset smoothing k as for the latency smoothing k, which you can update in real time. The latest smoothed latency and clock offset estimates are displayed at the top of the demo. You should find that with a k of 1 the numbers change far more dramatically than with a k of 0.1.
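A sketch of the offset update, run whenever a packet carrying a server timestamp arrives (t in the formula is the local receipt time; the names are my own):

var tOffset = 0; // smoothed estimate of (local clock - server clock)

function onTimestampedPacket(packet) {
  // packet.t was filled in server-side via Firebase.ServerValue.TIMESTAMP.
  // Note the raw measurement also includes one-way latency, as noted above.
  var rawOffset = Date.now() - packet.t;
  tOffset = k * rawOffset + (1 - k) * tOffset;
}

// Local clock corrected for skew, comparable across clients:
function serverNow() {
  return Date.now() - tOffset;
}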

Interpolation

It is both wasteful of bandwidth and a poor experience to transmit the position of everything at a high rate to convey movement over a network.

For objects like bullets, which move at a predictable rate along a predictable path, it's better to transmit the initialisation parameters once and let the other players advance the bullet through time using their own clocks.

bullet_pos = initial_pos + elapsed_time * velocity
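For example, a bullet can be created from a single spawn packet and then advanced locally every frame (a sketch; the packet fields are my own):

// Spawn packet, sent once when the bullet is fired:
// { x0: 100, y0: 50, vx: 300, vy: 0, t0: <spawn time in ms> }

function bulletPosition(packet, now) {
  var elapsed = (now - packet.t0) / 1000; // seconds since spawn
  return {
    x: packet.x0 + packet.vx * elapsed,
    y: packet.y0 + packet.vy * elapsed
  };
}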

However, there are some nuances around whose clock to use.

Server Time

It is tempting to use the server's clock as the authoritative time reference (after correcting for clock skew). After all, you expect the server to play referee for security. However, you should see the problem if you fire a constant stream of bullets for 30 seconds. Occasionally the stream of bullets pauses on the remote player, and when the stream resumes, all the delayed bullets are clustered together like a shotgun blast.

What has happened is that a single packet got delayed, and because of TCP's ordering guarantees the server effectively paused processing all subsequent bullets until the lost packet was retransmitted. When it did catch up, many bullets were processed at the same time with similar timestamps, giving the shotgun effect.

The effect may be harder to replicate in the US than it is here in the UK; if so, try the demo on a mobile browser instead, where packet loss is much more frequent (especially with a low signal).

Remote Time

Because the server can't know when the packets were transmitted, the only way to recreate the bullet stream correctly is to use the remote client's clock. Now when a delayed stream catches up, the bullets are created in the correct places. This requires stamping the bullet's data packet with the (skew-corrected) local time before transmitting it to the server. Try out the difference between remote time and server time on bullet streams in the demo.
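In terms of the earlier sketches, that means stamping the spawn packet with the sender's corrected clock instead of a server-side placeholder (bulletsRef and spawnBullet are hypothetical):

var bulletsRef = root.child("bullets");

// Sender: use the corrected local clock, not ServerValue.TIMESTAMP.
bulletsRef.push({
  x0: player.x, y0: player.y,
  vx: 300, vy: 0,
  t0: serverNow() // local clock minus the estimated offset
});

// Receiver: extrapolate from the sender's spawn time using its own
// corrected clock, so the stream replays with the original spacing.
bulletsRef.on("child_added", function (snapshot) {
  spawnBullet(snapshot.val()); // drawn per frame via bulletPosition(packet, serverNow())
});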

Player Movement

Unlike a bullet, players don't move predictably. In the demo I transmit the player's position and velocity every 100ms. Between packets, I use exactly the same code as for the bullets to infer movement by interpolation. If there are transmission delays, the remote player's position keeps moving and desynchronises from its true position. Without the clock adjustment, the player object does not interpolate properly while moving.
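A sketch of that scheme, reusing the bullet maths and the corrected clock from above (again, the names are my own):

// Sender: broadcast position and velocity ten times a second.
setInterval(function () {
  playerRef.set({
    x0: player.x, y0: player.y,
    vx: player.vx, vy: player.vy,
    t0: serverNow()
  });
}, 100);

// Receiver: between packets, extrapolate exactly like a bullet.
var lastState = null;
playerRef.on("value", function (snapshot) {
  lastState = snapshot.val();
});

// Each frame:
// if (lastState) position = bulletPosition(lastState, serverNow());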

Improvements

I am happy with the networked play experience for a basic real-time game. However, there is certainly more technology you could apply. If you want to learn about some really advanced techniques, check out Valve's write-up on the subject.

In the next part I will discuss how to add server-side security to the system, to prevent cheaters from deliberately skewing their clocks or spawning illegal bullets.

Code

The complete Construct 2 demo code is here.

The Construct 2 plugin for Firebase is here.