os.time() on Android vs iOS

Hi,

I have a real-time multiplayer racing game, and it is very important for the race to start at the exact same time for all players. As it is right now, I have a server setup where 2-4 players get matched and the server sends a raceStartTime, in seconds in UTC, to all players (clients).

All of my players get this raceStartTime and then I use 

Runtime:addEventListener( "enterFrame", checkRaceStartTime )

to check if

os.time() == raceStartTime

and when it’s true, the race starts for each client/player. This works perfectly when I test it on my iPhone and the Corona Simulator; the race does indeed start at the exact same time. You can see from this video that the scene and countdown load at different times but GO! appears at the same time. The problem comes when I build the game and deploy it on Android: the Android device starts the race late, about 1 second behind the other devices.
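For reference, here’s a minimal sketch of that listener (startRace is just a stand-in for whatever actually starts my race):

local raceStartTime   -- UTC seconds, set when the server packet arrives

local function checkRaceStartTime( event )
    -- >= rather than == so a dropped frame can't skip past the exact second
    if raceStartTime and os.time() >= raceStartTime then
        Runtime:removeEventListener( "enterFrame", checkRaceStartTime )
        startRace()   -- stand-in for the real start-of-race code
    end
end

Runtime:addEventListener( "enterFrame", checkRaceStartTime )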

Does anyone know why that is? Does os.time() return a different timestamp on Android than on iOS?

Thanks.

os.time() is very system-dependent (for example, just change the device’s time, never mind the differences between OSes).

OK well that answers a few questions. How do I access the device’s time? And hopefully convert it to seconds in UTC.

Computers (including your phone and other digital devices) have clocks driven by an internal vibrating quartz crystal. These oscillate at slightly different rates (usually differing by microseconds) and the clocks drift over time. Temperature and other factors affect accuracy, and it’s not unusual to see drift in the 1-2 seconds per day range.

Most computer operating systems today sync their clocks periodically off of some network time server. In an ideal world, these time servers sync from the US Naval Observatory’s atomic clock (which is very precise), and every network-connected device would have the exact same time. But in reality, things can cause times to get off by a few seconds. Operating system A might only sync on reboot. Operating system B might sync once an hour. Phones may sync from cell phone towers, while WiFi-connected devices may get their time from their router, which may get it from a generic network source.

I personally would not depend on your connected devices’ internal clocks to control things. What you can do, though, is have your server send a NOW packet with the server’s time and then compute the difference between the two using os.time(). It would also help to have an idea of how long it took that packet to arrive from the server, because depending on your networking technology, you could end up with enough packet lag to throw the time off by 1 second, which is enough to make the 1-second resolution ineffective.
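Something like this rough sketch; sendToServer and onNowReply are placeholders for whatever your networking layer provides:

local clockOffset   -- how far the server's clock is ahead of this device, in seconds

local sentAt = os.time()
sendToServer( { type = "NOW" } )                -- placeholder send

local function onNowReply( serverTime )         -- server's current UTC time, in seconds
    local roundTrip = os.time() - sentAt        -- crude: os.time() only has 1-second resolution
    clockOffset = ( serverTime + roundTrip / 2 ) - os.time()
end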

Rob

That is what I do now: I send a raceStartTime packet with the server’s time in seconds in UTC and check it against the client’s time. I have an idea of how long it takes a packet to arrive from the server, but I cannot depend on that because it’s different every time; that’s why I send a startTime instead.

I guess I could adjust my game to take into account the slight differences in starting times but I am trying to minimize that difference as much as possible.

@Rob I’m trying to think of how this method would help me. The issue I’m having is not what time the packet is received by the client (whether earlier or later); the issue is that each client’s os.time() is different. I am already sending a “NOW + 6” packet, which is the server’s time plus 6 seconds, to indicate exactly when the race should start, but I have nothing to compare that time against since each device has a different os.time().

What good will it do me if a packet is received earlier or later by the client? How long it takes that packet to arrive from the server is irrelevant to the client’s os.time(). The packet can arrive early and os.time() could be either ahead of or behind the other devices, and a packet can arrive late and os.time() can still be either ahead of or behind the other clients.

I could send a packet from the first client to start the race, but then I’ll still run into the issue of packets arriving earlier/later on each client.

How do other real-time race games handle this delay?

There are some interesting general discussions on this topic:

1. https://www.cakesolutions.net/teamblogs/how-does-multiplayer-game-sync-their-state-part-1

2. https://gamedev.stackexchange.com/questions/93477/how-to-keep-server-client-clocks-in-sync-for-precision-networked-games-like-quak

You can still send a “NOW” packet from your server to your clients saying “This is the time I think it is”; then your client can say, “Oh, I’m off by 6 seconds,” and when you get a start-game packet, you can add/subtract your local clock difference to/from the time received from the server and base your logic on that.
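Continuing the earlier sketch, applying that offset when the start packet arrives (onStartPacket is a placeholder):

-- clockOffset came from the earlier NOW exchange
local function onStartPacket( serverStartTime )
    local localStartTime = serverStartTime - clockOffset   -- the same instant on this device's clock
    -- compare os.time() against localStartTime instead of the raw server value
end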

However, I still don’t understand why you need to look at the timestamp. Your server should collect players until your queue is full or X amount of time has elapsed, and then just send a “start” packet to indicate that the game has started.

Rob

@Rob thanks for the explanation.

I thought about that, but before I begin implementing it, I was wondering: wouldn’t the lag between the NOW packet being sent and being received affect how far off the client thinks it is? Let’s say we know the client’s OS time is off by 2 seconds and the NOW packet took 1 second to arrive at the client. Won’t the client then think it is off by 3 seconds?

Same thing here: my issue is the time it takes for a packet to be received by each client.

I really appreciate your help :slight_smile:

We really haven’t discussed this, but how are you communicating between your client and your server? TCP sockets? UDP sockets? Web sockets? network.request()?

If you’re using sockets, your transmit/receive times should be a tenth of a second or less. I don’t have a lot of experience with websockets. Doing HTTP requests can be the slowest method.

Rob

As Rob states, you only need to sync the start time. Say players are queueing in a lobby and you get your required 5 players. You broadcast a start message with the server time. You then use that time in your app and ignore os.time() from that point on: you have an enterFrame() listener and you update game time based on the server start time + elapsed game time.

Problems will occur if your device’s render time per frame is > 16 ms (one frame at 60 fps), in which case you will need to adapt based on elapsed time.
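In rough sketch form (the names are made up; system.getTimer() is Corona’s millisecond timer):

local serverStartTime   -- server time from the broadcast start message
local startedAt         -- system.getTimer() captured when that message arrived

local function onEnterFrame( event )
    local elapsed = system.getTimer() - startedAt   -- ms since the start message
    local gameTime = serverStartTime + elapsed      -- shared game clock, in server terms
    -- drive all movement from gameTime so slow frames self-correct
end

Runtime:addEventListener( "enterFrame", onEnterFrame )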

I’m building my backend using GameSparks Real-Time Services and TCP sockets to ensure a reliable connection.

SGS, you’re totally right. What I’m doing now is sending “NOW + 6” seconds from the server and then comparing that time against os.time(), but what you said makes much more sense: just send NOW alone, and on the client side count down 6 seconds instead of counting on os.time() to match “NOW + 6”…I don’t even think I need a NOW time anymore; I can just start the countdown to the race the instant a server matchMadeMessage is received.
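Something like this, using Corona’s timer.performWithDelay (onMatchMade and startRace are my placeholder names):

local function onMatchMade( message )
    -- start the race 6 seconds after the match message arrives
    timer.performWithDelay( 6000, startRace )
end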

After hacking away at this and following your suggestions, I still have some delay between my devices in the race start time, significant enough to hinder the gameplay, BUT I came across this GameSparks Clock Synchronization tutorial, which I’ll follow and implement.

I hope that’ll do it!

Thanks for all your help guys :slight_smile:

@elifares,

Hi. I don’t want to misspeak here, but you can’t reasonably expect your players to be tightly synchronized unless you’re using a client-server setup with an authoritative server (the server tracks all objects and is solely responsible for deciding their positions and interactions).

What I’m saying is if you have two players A and B attached to a server you can expect this kind of situation:

  • At time 0, player A moves right by 10 pixels and sends the updated position to the server.
  • The flight time to the server is 50 ms.
  • The server then immediately (not likely) sends this update out to player B.
  • The flight time for this update is 32 ms.
  • By the time player B sees the move, it has been a minimum of 82 ms and this doesn’t include frame processing delays, queueing, etc.

This is nearly a tenth of a second of latency. Players moving at a high rate of speed will get out of sync and show movement hiccups quickly.

In the simplest version of a true client-server set-up, the clients merely give inputs and the server tells both clients about the update at the same time, so the interaction would go like this:

  • At time 0, player A tells the server “move me right by 10 pixels.”
  • The flight time to the server is 50 ms.
  • The server (which is tracking all objects in the game) updates the position of player A.
  • The server then immediately (not likely) sends this update out to players A and B.
  • The flight time to player A is 50 ms.
  • The flight time to player B is 32 ms.
  • Both players will receive the updated position and draw player A in the new position within 50 - 32 = 18 ms of each other. Thus, the update will occur on both devices within 1 or 2 frames of each other.

One more thing. You’d be better off using UDP.

Yes, you may lose some packets, but TCP queues up packets before sending them, and this can inject extra latency into your data transmission.

This can be even worse if Nagle’s algorithm comes into play anywhere along the route.

https://en.wikipedia.org/wiki/Nagle%27s_algorithm#Interactions_with_real-time_systems

I think you’re talking about mobile gaming here, but if you are talking about desktop gaming, be aware that Windows 7 and 10 both have Nagle’s algorithm enabled by default.
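For what it’s worth, if your TCP connection goes through LuaSocket (which ships with Corona), you can ask it to disable Nagle’s algorithm on the client side; whether GameSparks exposes its underlying socket is another question:

local socket = require( "socket" )

local tcp = socket.tcp()
tcp:connect( "game.example.com", 5555 )   -- placeholder host and port
tcp:setoption( "tcp-nodelay", true )      -- turn off Nagle's algorithm for this socket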

Finally, please give us the results of following that synchronization tutorial. 

I was going to start talking about this, but then re-read your post and was happy to see you found a discussion (and a tutorial too!) about the topic.

Thanks @roaminggamer for your detailed post.

I understand there will be delays between player updates and I’m fine with that. It is not the end of the world in my case; the race can handle some lag between player position updates.

My only focus is to start the race at the same time (or as close as possible); after that, I am fine with some delays and out-of-sync packets, which seem to be unavoidable. I just need the START to be synchronized.

I will certainly post the outcome of the tutorial I linked.

Cheers!

I’ve run into a hiccup. I need to be able to compare the system clock time to the server’s time, but Lua only gives me the time in seconds, not milliseconds.

I want to calculate the round-trip time it takes to send a packet and receive one back, but I can’t if I’m only working with seconds (the time it takes to send/receive packets happens at the millisecond level). Does anyone know how I could get os.time() in milliseconds, or as close to that as possible?

I understand that at 60 fps I am only going to get increments of about 16.67 ms, but that is still better than 1-second increments.

I’m following this thread, but can I use that logic to get os.time() in milliseconds?

See system.getTimer():  http://docs.coronalabs.com/api/library/system/getTimer.html
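For example, a rough round-trip measurement (sendPing and onPong are placeholders for your networking calls):

local sentAt

local function onPong()                     -- runs when the server's reply arrives
    local roundTrip = system.getTimer() - sentAt
    print( "round trip: " .. roundTrip .. " ms" )
end

sentAt = system.getTimer()                  -- milliseconds since app launch
sendPing( onPong )                          -- placeholder send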

Hi Rob and roaminggamer,

That’s exactly what I did: I’m using system.getTimer() as the client time, and IT WORKS! It turns out I don’t need os.time() at all for this. So, straight from the GameSparks clock sync tutorial, here’s what I did to sync the clocks (a rough code sketch follows the steps):

  1. Send a packet with the client’s local timestamp (from system.getTimer()) to the server.
  2. The server takes this timestamp, adds its own local time to the packet, and sends it back to the original sender.
  3. So the server returns both the client’s timestamp and the server’s timestamp, in milliseconds.
  4. The client receives this packet and compares its original send time to its current time to get the round-trip time.
  5. Half the round-trip time gives the latency.
  6. Subtract the server time from the client’s local time (in my case, the system.getTimer() value) to get the difference between the server’s clock and the client’s clock (that is, serverDelta; obviously this delta is going to be a big negative number, and that’s OK).
  7. We can then use the serverDelta, adjusted by the latency, to convert any time coming from the server into what it is on this device, therefore syncing all clients to the same time.
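And roughly in code; the GameSparks plumbing is reduced to placeholder send/receive functions, and I fold the latency into the delta:

local serverDelta   -- client clock minus estimated server clock, in ms

local function sendSyncRequest()
    sendToServer( { clientSent = system.getTimer() } )    -- step 1 (placeholder send)
end

local function onSyncReply( packet )   -- packet carries clientSent and serverTime (steps 2-3)
    local now = system.getTimer()
    local roundTrip = now - packet.clientSent             -- step 4
    local latency = roundTrip / 2                         -- step 5: assume a symmetric link
    -- steps 6-7: serverTime was stamped about one latency ago, so account for that
    serverDelta = now - ( packet.serverTime + latency )
end

local function toLocalTime( serverTime )
    return serverTime + serverDelta   -- convert a server timestamp to a local getTimer() value
end

-- e.g. start the race when system.getTimer() reaches toLocalTime( raceStartTime )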

I can repeat this process periodically, using the server time to re-align the client’s clock for any discrepancies, but like I said earlier, I only care about starting the race at the same time and not so much about discrepancies once the race starts, so running through these steps once onMatchFound works just how I need it.

Maybe in future updates I can update the client’s clock when I need to. Thanks everyone for all your help!