
Timing Precision: Centiseconds or Milliseconds?

Best precision to use in timer apps:

  • Centiseconds (100th of a second)
    Votes: 17 (27.9%)
  • Milliseconds (1000th of a second)
    Votes: 13 (21.3%)
  • Measure in milliseconds, display in centiseconds
    Votes: 26 (42.6%)
  • Other - please explain...
    Votes: 5 (8.2%)

  Total voters: 61

Cride5

Premium Member
Joined
Jan 27, 2009
Messages
1,228
Location
Scotland
WCA
2009RIDE01
I have a dilemma with a JavaScript timer I'm developing. The timing functions available in JS are capable of measuring time accurate to a 1000th of a second. However, most folks generally seem to be happy with times precise to a 100th of a second. This is the format used in most timers currently available and is also the format used by the WCA, presumably because that's the highest level of precision available using StackMat timers.

My question is: if it's possible to measure time in milliseconds, would this be preferred, or are hundredths adequate?

Another option is to measure all times (and calculate averages) in milliseconds, but then display them in hundredths by rounding. The problem with this is that it can result in a discrepancy between displayed singles and the calculated average.

So in short, what would you guys prefer: milliseconds; centiseconds; or measured in milli, displayed in centi?
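
Roughly, what I mean is something like this (a simplified sketch with made-up function names, not the actual timer code): measure and store raw milliseconds, and only round when displaying.

// Simplified sketch (made-up names, not the actual timer code).
// Times are measured in milliseconds and only rounded when displayed.
var startMs = 0;

function startTimer() {
  startMs = new Date().getTime();          // current time in milliseconds
}

function stopTimer() {
  return new Date().getTime() - startMs;   // elapsed time in milliseconds
}

function showCentis(ms) {
  // round to the nearest centisecond for the usual ss.cc display
  return (Math.round(ms / 10) / 100).toFixed(2);   // e.g. 12345 -> "12.35"
}

function showMillis(ms) {
  return (ms / 1000).toFixed(3);                   // e.g. 12345 -> "12.345"
}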
 

qqwref

Member
Joined
Dec 18, 2007
Messages
7,834
Location
a <script> tag near you
WCA
2006GOTT01
Yep, measure in milliseconds, and display in centiseconds (or have an option letting the user display in centi or milli). qqTimer does this for a reason :)

Sometimes you don't want the precision, but sometimes it's very useful to have... ever get an average that rounded to something like 20.00? [I had a 1x.997 average of 100 once.]
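
For example (invented numbers, just to show why the extra precision can matter): five counting times that all display as 20.00 can still average to a sub-20.

// Invented numbers, just to show why the extra precision can matter:
// five counting times that all display as 20.00 but average to 19.998.
var countingMs = [19996, 19997, 19998, 19999, 20000];
var sum = 0;
for (var i = 0; i < countingMs.length; i++) sum += countingMs[i];
var avgMs = sum / countingMs.length;            // 19998 ms

var centiDisplay = (avgMs / 1000).toFixed(2);   // "20.00"  -- centisecond display
var milliDisplay = (avgMs / 1000).toFixed(3);   // "19.998" -- millisecond display shows it was sub-20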
 

ove

Member
Joined
Dec 14, 2009
Messages
5
WCA
2009VERM01
I voted "other", because in my wildest dreams I would be happy if any computer based timer were able to even guarantee tenth second accuracy.

Basically, this is due to the accuracy of the OS timers used for multitasking (usually 20 milliseconds). Also, the keyboard is not a "real-time device", since a delay of 100 milliseconds in a keystroke wouldn't even be noticeable by a human (who types 10 letters a second?!).

With any software-based timer, accuracy depends on machine load and on what happens in the machine. If the hard disk or antivirus software kicks in during your solve, or on slow or heavily loaded machines, you may well have 200 or 300 milliseconds of error on the final time (I already saw that on my netbook with a heavy Java timer).
All you can expect is that, most of the time on a reasonable machine, the first decimal (tenths) is OK. For the second one, well... I would bet on an average error of around 4 or 5 in that digit.
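
If you want to see this on your own machine, a quick and crude probe like the one below shows the effective tick size of the JS clock (just a sketch):

// Crude probe of the JS clock's effective resolution: busy-wait until
// the reported time changes and record the size of each jump.
function probeClockSteps(samples) {
  var steps = [];
  for (var i = 0; i < samples; i++) {
    var t0 = new Date().getTime();
    var t1 = t0;
    while (t1 === t0) {
      t1 = new Date().getTime();   // spin until the next tick
    }
    steps.push(t1 - t0);           // observed step in milliseconds
  }
  return steps;  // e.g. [1, 1, 1, ...] or [15, 16, 15, ...] depending on OS/browser
}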
 

Cride5

Premium Member
Joined
Jan 27, 2009
Messages
1,228
Location
Scotland
WCA
2009RIDE01
Looks like the general consensus is measure in milli, display in centi ... which is cool 'cos it's already like that and means I have no extra work to do :p

However, I may (eventually) code in the ability to display all times in milliseconds as an option if folks think that will be useful. I'll also explain in the FAQ why the averages/totals sometimes don't appear to add up.

@ove, I was going to mention that in the original message but I figured it had gotten wordy enough already. If your program is swapped out when you slam your hand on that spacebar, it has to wait for the OS to swap it back onto the processor and take the time measurement. This could take any amount of time and isn't predictable. Avoiding it would require a program with exclusive access to the processor - which is basically impossible. The best remedy for that is to use a StackMat. I'm hoping to add interfacing to a StackMat as an option in the future...

EDIT: The keyboard delay probably isn't a huge issue though, because it's probably relatively constant. I.e. it will start with a 100ms delay, then also stop with a similar 100ms delay.
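
In other words, as long as the start and the stop are captured the same way, a roughly constant input delay cancels out. Simplified (not the actual timer code):

// Simplified sketch (not the actual timer code): start and stop are both
// taken on keydown, so a roughly constant keyboard delay d cancels:
// (stop + d) - (start + d) = stop - start
var running = false, startMs = 0;
document.onkeydown = function (e) {
  var evt = e || window.event;
  if ((evt.keyCode || evt.which) !== 32) return;   // space bar only
  if (!running) {
    startMs = new Date().getTime();
    running = true;
  } else {
    var elapsedMs = new Date().getTime() - startMs;
    running = false;
    // ...record/display elapsedMs here
  }
};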

@tim, interesting article. I guess the accuracy of time measurements depends not only on the kernel's scheduling of processes, but also on any schedulers running in the browser/JavaScript engine. IE performed predictably badly, yet another reason to avoid it like the plague! No surprises that Mac OS came out on top tho :D
 

Stefan

Member
Joined
May 7, 2006
Messages
7,280
WCA
2003POCH01
Just to illustrate the discrepancy when measuring millis and showing centis... something like this could happen:

(9.99)
9.99
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
(20.00)
=====
9.99 average

(5.00)
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.00
10.01
(10.01)
=====
10.01 average
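
To make it concrete (the millisecond values here are invented; any set that rounds like the first list above will do), a few lines of code show the two averaging schemes disagreeing on the same solves:

// Invented millisecond values that round to the first list above.
// Compare averaging the raw millis with averaging the rounded centis.
function roundToCentis(ms) { return Math.round(ms / 10) * 10; }  // nearest centisecond, kept in ms
function show(ms) { return (ms / 1000).toFixed(2); }

function trimmedMean(times) {               // average of 12: drop best and worst
  var sorted = times.slice().sort(function (a, b) { return a - b; });
  var counting = sorted.slice(1, sorted.length - 1);
  var sum = 0;
  for (var i = 0; i < counting.length; i++) sum += counting[i];
  return sum / counting.length;
}

var millis = [9985, 9989, 9995, 9995, 9995, 9995, 9995, 9995, 9995, 9995, 9995, 20000];

var avgOfMillis = show(trimmedMean(millis));                    // "9.99"  -- measure millis, average millis, show centis
var avgOfCentis = show(trimmedMean(millis.map(roundToCentis))); // "10.00" -- average the displayed centiseconds instead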
 

Stefan

Member
Joined
May 7, 2006
Messages
7,280
WCA
2003POCH01
As I see it, the only difference between "Centiseconds" and "Measure in milliseconds, display in centiseconds" is the average; for single times there's no difference. And the average can at most differ by 0.01 seconds. In cases where it does differ by those 0.01, measureMillis+showCentis would be slightly more accurate but at the cost of looking confusing. That's the reason I opted for centiseconds in my timer (i.e. I measure in millis because that's what the environment gives me, but I immediately round to centis and work with that rounded value from then on).
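
In code terms the policy is simply this (a sketch, not my timer's actual source):

// Sketch of that policy (not my timer's actual source): round each time to
// whole centiseconds as soon as it is recorded, then do all arithmetic on
// the rounded values so lists and averages always agree with what is shown.
function recordTime(rawMs) {
  return Math.round(rawMs / 10);               // stored value, in centiseconds
}

function averageCentis(centiTimes) {
  var sum = 0;
  for (var i = 0; i < centiTimes.length; i++) sum += centiTimes[i];
  return Math.round(sum / centiTimes.length);  // average, also in centiseconds
}

function display(centis) {
  return (centis / 100).toFixed(2);            // e.g. 999 -> "9.99"
}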
 

qqwref

Member
Joined
Dec 18, 2007
Messages
7,834
Location
a <script> tag near you
WCA
2006GOTT01
ove said:
Basically, this is due to the accuracy of the OS timers used for multitasking (usually 20 milliseconds). Also, the keyboard is not a "real-time device", since a delay of 100 milliseconds in a keystroke wouldn't even be noticeable by a human (who types 10 letters a second?!).

I regularly achieve speeds of 10 characters/sec (~120wpm) while typing and there are many people who are even faster than that; as a rhythm game player I often play files that require keypresses at 15+ keys per second. A lag of 100ms is easily noticeable, depending on the application.

I do agree that there are situations in which the timer gets delayed, but most of the time for me it seems to be pretty accurate (or at least the lag is consistent enough that the difference in the computer-measured time and the real time is small).
 

ove

Member
Joined
Dec 14, 2009
Messages
5
WCA
2009VERM01
Cride5 said:
Avoiding it would require a program with exclusive access to the processor - which is basically impossible. The best remedy for that is to use a StackMat. I'm hoping to add interfacing to a StackMat as an option in the future...

I remember having read documentation on MSDN about "high-precision timers" (in the DirectX stuff) which would provide 1 millisecond accuracy, for use with special devices like MIDI hardware and joysticks. But I'm not sure it would be worth it anyway (certainly more difficult to use than just adding StackMat support).
 

EE-Cuber

Member
Joined
Mar 12, 2009
Messages
29
Location
Near Cleveland, OH USA
Measuring in milliseconds is useless. When doing a solve, the time between finishing and smacking the stop button is on the order of tens of milliseconds... so this is lost in the "noise."
 

Stefan

Member
Joined
May 7, 2006
Messages
7,280
WCA
2003POCH01
In cases where it does differ by those 0.01, measureMillis+showCentis would be slightly more accurate but at the cost of looking confusing.

Not more accurate by 0.01, btw! I think this is the extreme case:

(4.985 show 4.99)
4.985 show 4.99
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
4.995 show 5.00
(4.995 show 5.00)

Average the millis:
4.994 show 4.99, off by 0.004

Average the centis:
4.999 show 5.00, off by 0.006

So in the absolute extreme case, averaging the millis rather than the centis is more accurate by only 0.002 seconds. Admittedly it's more for average-of-5:

(4.985 show 4.99)
4.985 show 4.99
4.995 show 5.00
4.995 show 5.00
(4.995 show 5.00)

Average the millis:
4.991666... show 4.99, off by 0.001666...

Average the centis:
4.99666... show 5.00, off by 0.008333...

So in the absolute extreme case, averaging the millis rather than the centis is more accurate by only 0.00666... seconds.
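
The extreme average-of-5 case above, in code (just re-doing the arithmetic):

// Re-doing the extreme average-of-5 arithmetic from above.
var countingMs = [4985, 4995, 4995];      // counting times in ms (best and worst dropped)
var centiCounts = [499, 500, 500];        // the same times rounded to centiseconds

var sum = 0, csum = 0;
for (var i = 0; i < 3; i++) { sum += countingMs[i]; csum += centiCounts[i]; }

var milliAvg = sum / 3;                   // 4991.666... ms (the "true" average)
var centiAvg = 10 * csum / 3;             // 4996.666... ms (averaging the rounded centis)

var a = (milliAvg / 1000).toFixed(2);     // "4.99" -- off from the true average by ~0.0017 s
var b = (centiAvg / 1000).toFixed(2);     // "5.00" -- off from the true average by ~0.0083 s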

Again, these are the extremes, with centiseconds being worse by 0.002 or 0.00666 seconds. This will rarely happen; usually both versions will show the same average result, and when they do differ, the actual amount by which averaging milliseconds is more accurate will be smaller than 0.002/0.00666. I feel this is not worth the hassle of showing confusing values and having to explain them.
 

Cride5

Premium Member
Joined
Jan 27, 2009
Messages
1,228
Location
Scotland
WCA
2009RIDE01
I've considered the opinions posted and decided to go for centiseconds across the board. The two main driving reasons were:
(1) the size of the errors generated by scheduling, unpredictable lag and other sources means millisecond precision is unwarranted; and
(2) inconsistency in averages and totals is not really acceptable, even if explained. It just looks as if there is something wrong with the code, making it less trustworthy.

Thanks for your views.
 