Timing in a user-land driver
- Subject: Timing in a user-land driver
- From: Stéphane Letz <email@hidden>
- Date: Thu, 3 Apr 2008 14:39:01 +0200
Hi,
Our user-land driver has to provide correct timing information as
parameters when calling the application's IOProc, so we need to fill
an AudioTimeStamp structure with appropriate mSampleTime, mHostTime,
and mRateScalar values. We were using
CAHostTimeBase::GetTheCurrentTime() to fill mHostTime, setting
mRateScalar to 1.0, and using a function that *estimates* the
device's real frame position when our device audio callback is called.
This meant that consecutive estimated frame times were not always
separated by exactly one device buffer size.
This timing behaviour was causing problems with QuickTime, which
dropped a sample from time to time when playing audio files. After
correcting mSampleTime to increment by exactly one complete buffer
size at each callback, everything seems to work now.
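For illustration, here is a minimal sketch of that approach: a running frame counter advanced by exactly one buffer per callback, rather than a frame position estimated from the host clock. The struct and class names below are simplified stand-ins I made up for this example (the real AudioTimeStamp lives in CoreAudio/CoreAudioTypes.h, and the host time would come from CAHostTimeBase::GetTheCurrentTime()).

```cpp
#include <cstdint>

// Simplified stand-in for Core Audio's AudioTimeStamp; only the three
// fields discussed above, with matching names.
struct AudioTimeStamp {
    double   mSampleTime;  // position on the device timeline, in frames
    uint64_t mHostTime;    // host clock ticks
    double   mRateScalar;  // ratio of actual to nominal sample rate
};

// Hypothetical helper: produces one timestamp per device callback,
// advancing mSampleTime by exactly one buffer each time.
class DriverTimeline {
public:
    explicit DriverTimeline(uint32_t bufferFrames)
        : mBufferFrames(bufferFrames), mFrames(0) {}

    // Call once per audio callback; hostTime is whatever the host
    // clock reads at that moment.
    AudioTimeStamp NextStamp(uint64_t hostTime) {
        AudioTimeStamp ts;
        ts.mSampleTime = static_cast<double>(mFrames);
        ts.mHostTime   = hostTime;
        ts.mRateScalar = 1.0;      // assume the device runs at nominal rate
        mFrames += mBufferFrames;  // advance by one complete buffer
        return ts;
    }

private:
    uint32_t mBufferFrames;
    uint64_t mFrames;  // frames consumed so far
};
```

With this scheme successive mSampleTime values are always separated by exactly the buffer size, which is what QuickTime appeared to expect.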
I just wanted to confirm: is this the right way to compute the timing?
Thanks
Best Regards
Stephane Letz
Coreaudio-api mailing list (email@hidden)