Extremely accurate timer

Let’s say you were going to hold a competition and you needed to time people as accurately as possible (to the millisecond if possible).

What way is best across all platforms?

I just read system.getTimer is not necessarily accurate to the millisecond on Android devices.

To me, from a mathematician’s standpoint, the best way to get as accurate a time as possible is as follows:

Step A: Get the starting time (down to the millisecond) - I was thinking of using system.getTimer().

Step B: Every millisecond or two, get the new time. Subtract the starting time from it - that is the elapsed time - and show it in a window.

Step C: Keep repeating Step B until they are finished.

Step D: Get the stop time (down to the millisecond). Subtract the starting time from this value - that will give us the exact run time.

Steps B and C are just for show… (rough sketch below)
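Roughly what I had in mind - an untested sketch against the Corona/Solar2D API; timeLabel, startRun and stopRun are just placeholder names:

```lua
-- Rough sketch of Steps A-D (untested; names are placeholders)
local startTime                      -- set in Step A
local timeLabel = display.newText( "0.000", display.contentCenterX, 100, native.systemFont, 32 )

local function updateDisplay()
    -- Steps B/C: show elapsed time; this part is just for show
    local elapsedMs = system.getTimer() - startTime
    timeLabel.text = string.format( "%.3f", elapsedMs / 1000 )
end

local function startRun()
    startTime = system.getTimer()                              -- Step A
    Runtime:addEventListener( "enterFrame", updateDisplay )    -- Steps B/C: update once per frame
end

local function stopRun()
    local stopTime = system.getTimer()                         -- Step D
    Runtime:removeEventListener( "enterFrame", updateDisplay )
    local finalMs = stopTime - startTime                       -- the run time that actually counts
    print( string.format( "Final: %.3f s", finalMs / 1000 ) )
    return finalMs
end
```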

So is getTimer the best way? Just how imprecise can it get on Android?

Or, to be 100% fair, should I just measure time to the tenth of a second and use getTimer?

Thanks!

Out of curiosity, where did you read that system.getTimer() is not accurate to the millisecond? According to the documentation, it returns the number of milliseconds since the app launched, and on some devices, there’s even a fraction signifying microseconds.

http://docs.coronalabs.com/api/library/system/getTimer.html

Right there on that page…

On Android, the values may be as accurate as a microsecond, but it depends on the device.

One millisecond == 1000 microseconds.

i.e. the timer can be far more accurate than one millisecond.

Hm, yes, I see how it could be interpreted like that. To me, the fact that the function returns the number of milliseconds implies that it is, at least, accurate to the millisecond – and the presence of an optional fraction suggests potential extra accuracy (i.e. it depends on the device whether microsecond accuracy is available). I could be misunderstanding the documentation, but it seems unlikely that a device wouldn’t have a clock with at least millisecond accuracy.

You should be able to get millisecond accuracy out of any modern device considering the API call gets microsecond accuracy.
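If you’re worried about a specific device, one quick (hypothetical) way to check is to sample system.getTimer() in a tight loop and look at the smallest non-zero step between consecutive readings:

```lua
-- Hypothetical resolution check: sample system.getTimer() repeatedly and
-- report the smallest non-zero difference between consecutive readings.
local function estimateTimerResolution( samples )
    samples = samples or 10000
    local smallest = math.huge
    local last = system.getTimer()
    for i = 1, samples do
        local now = system.getTimer()
        local diff = now - last
        if diff > 0 and diff < smallest then
            smallest = diff
        end
        last = now
    end
    return smallest   -- in milliseconds; a fractional result implies sub-millisecond ticks
end

print( "Smallest observed tick: " .. estimateTimerResolution() .. " ms" )
```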

But keep in mind: the screen can only update once every 1/60th of a second, which is around 16 ms per frame. Human reaction time to start and stop/pause the timer is around 10 ms for highly skilled people and around 100 ms on average. It reminds me of the saying:

“Measure with a Micrometer, mark with chalk, cut with an axe.”

Basically, what’s the purpose of such an accurate measurement when your controls are not that accurate?

Rob

Actually, all I care about is 1/100th-of-a-second accuracy - to time a test/competition that they do on the device.

The intermediary timer (what they see) doesn’t need to be perfectly accurate, but the final time (stop time minus start time) is what matters.
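For the final value I’d just round the elapsed milliseconds to hundredths - something like this trivial helper (the name toHundredths is made up):

```lua
-- Round a millisecond duration to 1/100th of a second (helper name is made up)
local function toHundredths( elapsedMs )
    return math.floor( elapsedMs / 10 + 0.5 ) / 100   -- seconds, two decimal places
end

print( toHundredths( 12345.678 ) )   --> 12.35
```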

So, getTimer will work perfectly. 

Thanks everyone.
