Re: NSTimer: Test Results
- Subject: Re: NSTimer: Test Results
- From: Andreas Grosam <email@hidden>
- Date: Tue, 17 Feb 2009 12:57:40 +0100
On Feb 17, 2009, at 12:33 AM, Graham Cox wrote:
On 17 Feb 2009, at 6:39 am, Andreas Grosam wrote:
So, according to the documentation, neither method A nor method B is used to calculate the firing time. Actually it is:
Let ts(i) be the scheduled firing times, where ts(0) is the initial scheduled time, T is the interval, and i refers to the i-th timer event, i >= 1:

ts(i) = ts(i-1) + T
I expected the scheduling time to have a very small inaccuracy; however, a quick test shows that the error is quite large: for T = 1 sec, I got an offset of -1 msec roughly every 25 events. This accumulates to roughly 150 msec after an hour.
This large error cannot be explained by rounding errors or by the precision limits of doubles as the time value. I measured the time with [NSDate timeIntervalSinceReferenceDate]. So, where does the error come from?
Interestingly, I was able to compensate for the error by setting the timer interval to 1.0 + 0.001/25. With this interval, the measured error almost disappeared, and the events fired almost exactly every second.
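For illustration, a minimal sketch of such a drift test might look like the following, assuming a 1-second repeating NSTimer; the DriftMonitor class and its names are illustrative, not part of the original test code:

#import <Foundation/Foundation.h>

@interface DriftMonitor : NSObject
@property (nonatomic) NSTimeInterval startTime;   // ts(0)
@property (nonatomic) NSUInteger fireCount;       // i
- (void)timerFired:(NSTimer *)timer;
@end

@implementation DriftMonitor
- (void)timerFired:(NSTimer *)timer
{
    self.fireCount += 1;
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    // Ideal fire time for the i-th event: ts(i) = ts(0) + i * T, T = 1 sec.
    NSTimeInterval ideal = self.startTime + self.fireCount * 1.0;
    NSLog(@"event %lu: offset = %+.3f msec",
          (unsigned long)self.fireCount, (now - ideal) * 1000.0);
}
@end

int main(void)
{
    @autoreleasepool {
        DriftMonitor *monitor = [[DriftMonitor alloc] init];
        monitor.startTime = [NSDate timeIntervalSinceReferenceDate];
        [NSTimer scheduledTimerWithTimeInterval:1.0
                                         target:monitor
                                       selector:@selector(timerFired:)
                                       userInfo:nil
                                        repeats:YES];
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}

A negative, steadily growing offset in this log would match the behaviour described above.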
Without knowing exactly what you're trying to do, I still suggest
you don't rely on the timer firing at a precise known time to
calculate timing intervals.
The timer triggers animations which run at their own frame rate. The animations are short (<2 sec), but the trigger itself should be precise (±50 ms) over a long duration.
Instead measure the elapsed time and use that value to perform your calculations. That way you'll be insulated from variations in the timer firing period, which can be affected by a wide variety of external influences.
The animations themselves are already decoupled, so once started they (should) run smoothly.
For example, in a physics simulation where you want to compute an object's position based on its velocity, you can drive that with a timer, but by calculating the elapsed time you can place the object accurately even if the timer wanders all over the place. If you assume the timer is firing at a constant rate and base your calculations on that, you'll find the velocity/position will wander with the timer. Typically the timer will fire at roughly the maximum frame rate you want to go at, but if it can't maintain that, your simulation won't suffer in accuracy, only in resolution (i.e. it'll drop frames as needed to keep up).
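A minimal sketch of this elapsed-time approach (the Mover class and its names are illustrative, not from the original post): the timer merely paces the updates, while the position is always derived from the measured elapsed time, so timer jitter cannot accumulate into the simulation state.

#import <Foundation/Foundation.h>

@interface Mover : NSObject
@property (nonatomic) NSTimeInterval startTime;
@property (nonatomic) double velocity;   // units per second
- (void)tick:(NSTimer *)timer;
@end

@implementation Mover
- (void)tick:(NSTimer *)timer
{
    // Wall-clock time elapsed since the animation began.
    NSTimeInterval elapsed =
        [NSDate timeIntervalSinceReferenceDate] - self.startTime;
    // Position depends only on elapsed time, not on how many times the
    // timer has fired or how late each individual firing was.
    double position = self.velocity * elapsed;
    NSLog(@"t = %.3f s, position = %.3f", elapsed, position);
}
@end

int main(void)
{
    @autoreleasepool {
        Mover *mover = [[Mover alloc] init];
        mover.startTime = [NSDate timeIntervalSinceReferenceDate];
        mover.velocity = 10.0;
        // Pace updates at roughly 30 fps; a late or dropped tick only
        // lowers the resolution, never the accuracy of the position.
        [NSTimer scheduledTimerWithTimeInterval:(1.0 / 30.0)
                                         target:mover
                                       selector:@selector(tick:)
                                       userInfo:nil
                                        repeats:YES];
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}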
This would be the solution if I needed to calculate each frame and the corresponding model state from the current real time. In other parts of my problem, or if I have to use lower-level frameworks (OpenGL), I will probably need to use this method.
Nonetheless, the aforementioned animations that run in their own thread are triggered by a more or less precise (±50 ms) timer. I can accept this jitter (roughly three frames), but not an offset that accumulates over time. So I think I can easily solve the problem by using
a) the same approach, but adjusting the timer (eliminating the accumulated offset) every once in a while, or
b) a one-shot timer, and scheduling a new one-shot timer firing at t0 + n*T (see the sketch below).
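A minimal sketch of option b) (the Scheduler class and its names are illustrative): each handler schedules the next one-shot timer against the ideal schedule t0 + n*T, so the lateness of one firing never shifts later events.

#import <Foundation/Foundation.h>

@interface Scheduler : NSObject
@property (nonatomic) NSTimeInterval t0;       // initial scheduled time
@property (nonatomic) NSUInteger n;            // index of the next event
@property (nonatomic) NSTimeInterval period;   // T
- (void)scheduleNext;
- (void)fired:(NSTimer *)timer;
@end

@implementation Scheduler
- (void)scheduleNext
{
    self.n += 1;
    // Delay until the ideal absolute time t0 + n*T, measured from now,
    // so per-event errors are not carried into the next interval.
    NSTimeInterval target = self.t0 + self.n * self.period;
    NSTimeInterval delay = target - [NSDate timeIntervalSinceReferenceDate];
    [NSTimer scheduledTimerWithTimeInterval:MAX(delay, 0.0)
                                     target:self
                                   selector:@selector(fired:)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)fired:(NSTimer *)timer
{
    // Trigger the animation here. Any jitter in this firing stays
    // bounded, because the next delay is computed back from t0.
    [self scheduleNext];
}
@end

int main(void)
{
    @autoreleasepool {
        Scheduler *scheduler = [[Scheduler alloc] init];
        scheduler.period = 1.0;   // T = 1 second
        scheduler.t0 = [NSDate timeIntervalSinceReferenceDate];
        [scheduler scheduleNext];
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}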
The timer could also run in its own thread, not the main thread. Sure, there is still unknown inaccuracy in when the timer actually fires and when the handler is eventually invoked - which you have already mentioned. Alas, nothing is perfect, and if I can achieve ±50 ms accuracy, that is good enough.
Thank you for your suggestion
Regards
Andreas