timer.performWithDelay() runs slower than specified interval

I’ve used timer.performWithDelay() extensively in my games, but I realized today that it is not invoking the listener at the specified interval.

Repro:

  1. Call timer.performWithDelay() with any interval from 10 to 100 ms and infinite iterations.
  2. In the listener, increment a counter variable by the interval passed to timer.performWithDelay(), so that the counter tracks the elapsed time since the call.
  3. Expected: the counter increases in step with real time, like a stopwatch.
  4. Actual: the counter increases slower than real time; it reaches 10 seconds roughly 3-5 seconds after 10 real-time seconds have passed. (A minimal sketch of this repro follows the list.)
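
Roughly, the repro looks like this (the 50 ms interval and the variable names are only illustrative):

local interval = 50                         -- ms; anything from 10 to 100 shows the effect
local counter = 0                           -- elapsed time as accumulated by the timer
local startTime = system.getTimer()

timer.performWithDelay( interval, function()
    counter = counter + interval
    local realElapsed = system.getTimer() - startTime
    print( "counter (ms):", counter, "real (ms):", realElapsed )  -- counter falls further and further behind
end, 0 )  -- 0 = repeat forever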

Is this a known gotcha with this API call? Shouldn’t the listener be called at the specified interval and no slower (10 - 100 ms seems reasonable)?

I’m sure the staff can explain this better, but in a nutshell,

  1. The timer mechanism is not guaranteed to be exact.

  2. The SDK’s main loop is strongly tied to frames and frame rate.

  3. Activities (collision responses, other event responses, and timers) are queued for linear execution. In any given frame there may be tens, hundreds, or even thousands of actions that must be executed in order.

  4. As actions are executed, time passes. Thus a timer may not fire exactly N milliseconds later, but rather N+M milliseconds later. Think of this as ‘timer drift’.

  5. If you’re using a timer with 10 to 100 iterations, timer drift becomes cumulative, adding up to a larger and larger error in your final duration. (A quick way to see the per-tick drift is sketched after this list.)
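
If you want to see the drift for yourself, something like this should work (the 50 ms interval is arbitrary; this is just a rough sketch):

local interval = 50
local startTime = system.getTimer()
local tick = 0

timer.performWithDelay( interval, function()
    tick = tick + 1
    local ideal  = tick * interval                  -- when this tick "should" have fired
    local actual = system.getTimer() - startTime    -- when it actually fired
    print( "tick", tick, "drift (ms):", actual - ideal )
end, 0 )

Because each callback is scheduled relative to the frame it runs in rather than the ideal schedule, the drift value tends to grow with each tick rather than staying bounded.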

Ed, you explained that pretty well.

Rob

Thanks for the explanation; it does seem like a strange design decision not to decouple the SDK’s main loop from frames.

What’s the best workaround if I want some sort of cumulative timer (stopwatch) for a game? Just use the parameters passed in the event object to enterFrame() (i.e. delta time)?

Yes, your best bet is to track the start time via system.getTimer() and then update your ‘stopwatch’ to show the true number of milliseconds since the start. You can do these updates per frame or via timer.performWithDelay(), but always compute the elapsed time like this:

local startTime = system.getTimer()
...
local elapsedTime = system.getTimer() - startTime  -- in milliseconds
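
Putting that together as a per-frame stopwatch, something like this should work (the text object position and formatting are just placeholders):

local startTime = system.getTimer()
local stopwatchText = display.newText( "0.0", display.contentCenterX, 40, native.systemFont, 24 )

local function onEnterFrame( event )
    -- Always recompute from the real clock; never accumulate per-tick deltas,
    -- or the timer drift described above creeps back in.
    local elapsedTime = system.getTimer() - startTime   -- milliseconds
    stopwatchText.text = string.format( "%.1f", elapsedTime / 1000 )
end

Runtime:addEventListener( "enterFrame", onEnterFrame )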
