Creating countdown timer

I am trying to create a countdown timer in Corona SDK that displays the minutes, seconds, and centiseconds left. I have tried the following:

local centiSecondsLeft = 1*6000 + 10*100    -- 1 minute + 10 seconds, in centiseconds
local clockText = display.newText("", display.contentCenterX,
    display.contentCenterY - 100, native.systemFont, 48)
local minutes
local seconds
local centiSeconds

local function updateTime()
    minutes = math.floor( centiSecondsLeft / 6000 )
    seconds = math.floor( (centiSecondsLeft - (minutes*6000)) / 100 )
    centiSeconds = (centiSecondsLeft - (minutes*6000)) % 100
    clockText.text = string.format( "%02d:%02d:%02d", minutes, seconds, centiSeconds )
    centiSecondsLeft = centiSecondsLeft - 1
end

timer.performWithDelay(10, updateTime, centiSecondsLeft)


Although the timer displays the time, each second seems to take longer than a real second, and the timer stops with 1 centisecond still left. What can I do about it?

Please help.

First, the timer functions are not guaranteed to be called at exactly the specified interval (I forget the specific reason but it’s going to be close, just not exact.)

Also, you’re decrementing the ‘centiSecondsLeft’ value after you’ve updated the display values - try moving that to the top of the function.

At 60 fps the fastest screen draw will be 1000/60 ≈ 16.7 ms. So even if your timer was able to always fire on 10 ms intervals, that would cause issues. Stick to maybe tenths of a second?
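Putting those two points together, here's a minimal sketch of the counting fix, reusing the variables from the code above. It cures the off-by-one (the display now reaches 00:00:00), but not the drift, since each tick still can't fire faster than one frame:

local function updateTime()
    -- decrement first, then clamp and draw
    centiSecondsLeft = centiSecondsLeft - 1
    if centiSecondsLeft < 0 then centiSecondsLeft = 0 end
    local minutes = math.floor( centiSecondsLeft / 6000 )
    local seconds = math.floor( (centiSecondsLeft % 6000) / 100 )
    local centiSeconds = centiSecondsLeft % 100
    clockText.text = string.format( "%02d:%02d:%02d", minutes, seconds, centiSeconds )
end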

timer events are only “considered” once per frame

thus, resolution is never more accurate than frame rate (which will itself fluctuate, btw)

if you need more accurate elapsed or delta times (accumulated, decremented, etc.) then you should use true time functions (os.clock, os.time, system.getTimer, etc.) and calculate the interval between each timer event (ie “eventElapsed = now - previous”), or since the start of the entire period (ie “totalElapsed = now - start”).
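For the countdown in question, that approach looks something like this (a sketch, not drop-in code; the variable names are mine, and it assumes the clockText object from the original post). The remaining time is always derived from system.getTimer(), so late frames no longer accumulate error:

local totalCentiSeconds = 1 * 6000 + 10 * 100      -- the full period, 1 min 10 sec
local startTime = system.getTimer()                -- ms since app launch

local function updateTime( event )
    -- derive the remaining time from the real clock, not from a tick counter
    local elapsedMs = system.getTimer() - startTime
    local left = totalCentiSeconds - math.floor( elapsedMs / 10 )
    if left <= 0 then
        left = 0
        timer.cancel( event.source )               -- period over; stop the timer
    end
    local minutes = math.floor( left / 6000 )
    local seconds = math.floor( (left % 6000) / 100 )
    local centiSeconds = left % 100
    clockText.text = string.format( "%02d:%02d:%02d", minutes, seconds, centiSeconds )
end

-- the delay here only paces redraws; accuracy comes from system.getTimer()
timer.performWithDelay( 30, updateTime, 0 )        -- 0 iterations = repeat forever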

edit, for clarification:

imagine that “timer.performWithDelay” were instead named “timer.performOnNextFrameAfterAtLeastThisMuchDelay” (because that’s how it actually works internally); then perhaps it’d be more obvious why your method isn’t achieving the desired accuracy.

You can confirm this with this single line of code:

timer.performWithDelay( 10, function() print(system.getTimer()) end, 100 )

on 30 fps it will take around 3 seconds to complete (100 iterations × ~33.3 ms per frame ≈ 3.3 s) and on 60 fps around a second and a half (100 × ~16.7 ms ≈ 1.7 s), instead of the nominal 1 second.
