Timer issue - it is not precise - cannot achieve a 1000ms delay

Hi,

Simple code:
[lua]local timeStamp = system.getTimer()
local tDelta

local function update(event)

    --tDelta = system.getTimer() - timeStamp
    --timeStamp = system.getTimer()

    tDelta = event.time - timeStamp
    timeStamp = event.time

    print("tDelta = " .. tDelta)
end

timer.performWithDelay(1000, update, 0)[/lua]

Console output for time = 1000ms
tDelta = 1022.673
tDelta = 1023.055
tDelta = 1023.21
tDelta = 1022.749
tDelta = 1023.493

Ok - so I thought I would change the 1000ms to 980ms, so I would get 1000ms between each iteration.
NOPE!
Console output for time = 980ms
tDelta = 990.263
tDelta = 990.627
tDelta = 990.035
tDelta = 989.108
tDelta = 989.844
tDelta = 990.24

Ok - last go - I changed it to 990ms
Console output for time = 990ms
tDelta = 1023.274
tDelta = 990.602
tDelta = 1022.069
tDelta = 990.303
tDelta = 990.042
tDelta = 1022.978
tDelta = 990.552
tDelta = 1021.984

It does not make sense to me! Why is it 1023 one time and 990 the other?

I used simple code as an example here.

The real problem is that I am working on a game that has a 60-second mode - so I have something like
[lua]timer.performWithDelay(1000, gameUpdate, 0)[/lua]
That gameUpdate function updates the timer on the screen, triggers game over when the timer reaches 0, and so on. The details don't matter here.
I measured the timing for my game exactly the same way as with the sample code above.
It was usually 1008ms between each iteration.
After 60 iterations the game lasted about 60.5 seconds instead of 60 (1008ms x 60 ≈ 60,480ms).
I tested that on the simulator, iPhone 4s and iPad 3 - the game lasted from 60,450ms to 60,700ms.

Can anyone shed some light on this issue?

P.S. I was working with Build 2012.912.

Many thanks
Patryk

EDIT 1: corrected code (was 990ms, should be 1000ms)

EDIT 2:
Just for your information - testing my game on the simulator gave these results

time = 1000ms
tDelta= 1008.019
tDelta= 1008.104
tDelta= 1008.073

time = 990 ms
tDelta= 992.194
tDelta= 991.591
tDelta= 991.873

time = 980 ms
tDelta= 992.298
tDelta= 991.958
tDelta= 992.221
Note: no, it is not a mistake - 990ms and 980ms give the same result.

Any idea?

That's actually pretty common with timers that work in this manner. I had the same issue with Delphi and C#. It has something to do with how the update cycles work. Three options I can think of are:

  1. Compare two different timestamps inside an enterFrame or a quick timer event. One timestamp would be from your previous cycle and the second would be the current one. I can't recall the API call for getting the timestamps, so you will have to look that up :). (There is a sketch of this after the list.)

  2. Change your timer to be 100ms. You will still have a slight inaccuracy but it should help a tiny bit.

  3. This is my recommendation… Unless you REALLY need that absolute precision, I would just stick with the 1000ms timer (or lower it to 100ms and adjust your numbers accordingly). No one is going to notice that your game lasted 60.x seconds instead of 60.
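
To make option 1 concrete, here is a minimal sketch in Corona-style Lua, assuming only the standard Runtime and system.getTimer() APIs; the 60-second countdown and the onGameOver handler are illustrative placeholders, not anything from the posts above.

[lua]-- Sketch of option 1: measure the real elapsed time between frames
-- yourself instead of trusting the timer's nominal delay.
local countdown = 60000                -- hypothetical 60-second mode, in ms
local lastTime = system.getTimer()     -- timestamp of the previous frame

local function onEnterFrame(event)
    local now = system.getTimer()
    local dt = now - lastTime          -- actual time elapsed since last frame
    lastTime = now

    countdown = countdown - dt
    if countdown <= 0 then
        Runtime:removeEventListener("enterFrame", onEnterFrame)
        -- onGameOver()                -- hypothetical game-over handler goes here
    end
end

Runtime:addEventListener("enterFrame", onEnterFrame)[/lua]

Because the countdown is driven by measured deltas rather than a fixed tick count, the per-frame jitter does not accumulate, so the mode ends after roughly 60 real seconds.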

Are you running at 30 fps? 1000/30 ≈ 33. So the precision of your timers is within about 33 milliseconds, which is what you're seeing :)

If you go to 60 fps, you will cut it to about 17 milliseconds. A little better.
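
For reference, the frame rate is set in config.lua; a minimal sketch, assuming a typical project setup (the width, height and scale values are just illustrative, not taken from this thread):

[lua]-- config.lua (sketch; see assumptions above)
application =
{
    content =
    {
        width = 320,
        height = 480,
        scale = "letterbox",
        fps = 60,  -- at 60 fps, timer callbacks fire on ~17ms frame boundaries
    },
}[/lua]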

Now I understand.
I needed absolute precision for soundtrack sync.
So it seems that I cannot count on timer precision for this, as there is no such thing! ;)
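
One possible direction for the soundtrack case, sketched under assumptions (the file name, the beat interval and the onBeat handler are hypothetical): record a timestamp when playback starts and schedule sync points against elapsed wall-clock time in an enterFrame listener, so that errors do not accumulate over the track.

[lua]local soundtrack = audio.loadStream("soundtrack.mp3")  -- hypothetical file
local beatInterval = 500        -- hypothetical: one sync point every 500ms
local nextBeat = beatInterval
local songStart

local function onEnterFrame(event)
    local elapsed = system.getTimer() - songStart
    if elapsed >= nextBeat then
        -- onBeat()             -- hypothetical handler for a sync point
        nextBeat = nextBeat + beatInterval
    end
end

songStart = system.getTimer()
audio.play(soundtrack)
Runtime:addEventListener("enterFrame", onEnterFrame)[/lua]

The resolution is still one frame (about 17ms at 60 fps), but each sync point is scheduled against the song's start time rather than the previous tick, so drift does not build up over 60 seconds.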

Anyway, thank you guys for your time and explaining this to me. I will have to think about other ways to design my engine in the future.

All the best.
Patryk
