Re: AudioTimeStamp in AURenderCallback
- Subject: Re: AudioTimeStamp in AURenderCallback
- From: "Dave O'Neill" <email@hidden>
- Date: Sun, 12 Apr 2015 00:18:36 -0700
Thank you for helping me clear this up. I measured by comparing mHostTime with mach_absolute_time(), then subtracting IOBufferDuration and outputLatency. I'm using a render notify on the RemoteIO unit on iOS. I'm seeing one timestamp used for both the input and output callback buffers, and that timestamp is set to about (IOBufferDuration + outputLatency) in the future. How can a future timestamp represent a buffer of samples from the microphone that has already been captured?
static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    ViewController *self = (__bridge ViewController *)inRefCon;
    // How far the callback's timestamp is ahead of "now", in host ticks.
    SInt64 ticksDiff = inTimeStamp->mHostTime - mach_absolute_time();
    float secsDiff = (ticksDiff * ticksToSeconds()) - self->ioBufferDuration - self->outputLatency;
    printf("secs %f\n", secsDiff);
    return noErr;
}
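In case it matters, ticksToSeconds() is just the usual mach_timebase_info conversion, and ioBufferDuration/outputLatency are cached from AVAudioSession before the unit starts, with the notify installed on the RemoteIO unit. Roughly like this (a sketch of the surrounding setup; the exact code may differ):

#include <mach/mach_time.h>

// Host-tick -> seconds factor, computed once from the kernel timebase.
static double ticksToSeconds(void) {
    static double factor = 0.0;
    if (factor == 0.0) {
        mach_timebase_info_data_t timebase;
        mach_timebase_info(&timebase);
        factor = ((double)timebase.numer / (double)timebase.denom) * 1.0e-9;
    }
    return factor;
}

// Assumed setup, done before starting the unit:
AVAudioSession *session = [AVAudioSession sharedInstance];
ioBufferDuration = session.IOBufferDuration;
outputLatency    = session.outputLatency;
AudioUnitAddRenderNotify(remoteIOUnit, renderCallback, (__bridge void *)self);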
Thanks for the help