Hi all,
I have a question regarding precise timing in Core Audio on iOS, so here goes:
Question
I have a very simple iOS Core Audio application with the following structure:
Remote I/O Unit Input Bus --> Render Callback --> Remote I/O Unit Output Bus
The render callback, invoked by the Remote I/O output bus, pulls samples from the input hardware by calling AudioUnitRender() on the Remote I/O input bus. It then processes these samples, writes the result into the supplied AudioBufferList*, and returns, after which the processed samples are played through the output hardware. All works well.
My question is: how can I know, or calculate, the precise time at which:
- the samples were captured by the input hardware
- the samples will actually be played by the output hardware
Discussion
An AudioTimeStamp struct is passed into the render callback with valid mHostTime, mSampleTime & mRateScalar values. It is not clear to me exactly what this time stamp reflects. The documentation states: "inTimeStamp The timestamp associated with this call of audio unit render."
This sounds like it represents the time at which the render call was invoked, but how does that relate (if at all) to the time at which the input samples were captured and the time at which the output samples will be played?
Several resources online suggest using mach_absolute_time() or CACurrentMediaTime() to obtain the current host time, but again I can't see how to get from the current host time to a host time in the past (when the input was captured) or in the future (when the output will be played).
The following quote from another coreaudio-api thread describes three time stamps, including separate time stamps for input data in the past and output data in the future. This is exactly what I am looking for; however, I believe that discussion concerns OS X and the AUHAL I/O unit, and I cannot find a way of retrieving these time stamps on iOS.
"So, the way CoreAudio works is that an I/O proc fires and gives you 3 time stamps:
(1) the time stamp of the input data - if any, of course. This will always be at least a buffer size in the past.
(2) the time stamp for now - when the I/O proc was woken up to run.
(3) the time stamp for the output data you will provide. This is always some time in the future - usually it is a buffer size in the future."
(http://lists.apple.com/archives/coreaudio-api/2005/Sep/msg00220.html)
I have been directed towards inspecting the input and output latency values on the audio session, as well as kAudioUnitProperty_Latency on the audio unit itself. This seems to be on the right track, but I still need to know what inTimeStamp represents before I can adjust it by any such latency values.
Thanks in advance,
Andy.