Cride5
Premium Member
I have a dilemma with a JavaScript timer I'm developing. The timing functions available in JS can measure time accurate to a thousandth of a second (milliseconds). However, most folks generally seem happy with times precise to a hundredth of a second. This is the format used in most timers currently available, and it's also the format used by the WCA, presumably because that's the highest precision available from StackMat timers.
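For context, here's a rough sketch of the kind of measurement I mean (illustration only, not my actual timer code), using the standard Date object:

```javascript
// Rough sketch of millisecond timing with the standard Date object
// (illustration only, not my actual timer code)
var startTime;

function startTimer() {
  startTime = new Date().getTime();               // ms since the epoch
}

function stopTimer() {
  var elapsedMs = new Date().getTime() - startTime;
  return elapsedMs;                               // e.g. 12345 means 12.345 s
}
```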
My question is: if it's possible to measure time in milliseconds, would that be preferred, or are hundredths adequate?
Another option is to measure all times (and calculate averages) in milliseconds, but display them in hundredths by rounding. The problem is that this can produce a discrepancy between the displayed singles and the calculated average: the average of the rounded singles may not match the average computed from the raw millisecond times.
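To make the discrepancy concrete, here's a hypothetical example (the times are made up, and Math.round is just one possible rounding choice):

```javascript
// Hypothetical example: three solves measured in milliseconds,
// but displayed in centiseconds.
var timesMs = [10114, 10115, 10115];

// Round a millisecond time to centiseconds for display
function toCenti(ms) {
  return Math.round(ms / 10) / 100;                  // e.g. 10114 -> 10.11
}

// Average from the raw millisecond values, rounded once at the end
var rawAvgMs = (timesMs[0] + timesMs[1] + timesMs[2]) / 3;   // 10114.67 ms
console.log(toCenti(rawAvgMs));                              // 10.11

// Average of the already-rounded displayed singles
var displayed = timesMs.map(toCenti);                        // [10.11, 10.12, 10.12]
var displayedAvg = (displayed[0] + displayed[1] + displayed[2]) / 3;
console.log(Math.round(displayedAvg * 100) / 100);           // 10.12
```

So the average calculated from the raw times shows 10.11, while averaging the times the user actually sees gives 10.12.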
So in short, what would you guys prefer: milliseconds, centiseconds, or measured in milliseconds but displayed in centiseconds?