CoreAudio: Calculate total latency between input and output with kAudioUnitSubType_VoiceProcessingIO
- Subject: CoreAudio: Calculate total latency between input and output with kAudioUnitSubType_VoiceProcessingIO
- From: Eric Herbrandson via Coreaudio-api <email@hidden>
- Date: Mon, 20 Jan 2020 08:09:18 -0600
I am working on an application using CoreAudio on the iPhone/iPad. The
application both plays audio through the speakers (output) and records
audio from the microphone (input) at the same time. For the purposes of this
application it is extremely important that I be able to compare the input and
output, specifically how well they "line up" in the time domain. Because of
this, correctly calculating the total latency between the input and output
channels is critical.
I am testing across three different devices: an iPhone, an iPad, and the simulator.
I've been able to empirically determine that the latency for the iPhone is
somewhere around 4050 samples, the iPad is closer to 4125 samples, and the
simulator is roughly 2500 samples.
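(By "empirically determine" I mean something along the lines of the sketch
below: play a short click through the output, capture the mic input, and find
the lag that best aligns the two buffers. The played/recorded arrays here are
hypothetical buffers captured from the output and input callbacks; this is a
simplified version of the kind of measurement I'm describing, not the exact
test code.)

// Simplified sketch of the round-trip measurement: find the lag (in samples)
// that maximizes the cross-correlation between what was played and what the
// mic picked up. 'played' and 'recorded' are hypothetical Float buffers
// captured from the output and input callbacks.
func roundTripLatencyInSamples(played: [Float], recorded: [Float], maxLag: Int = 8192) -> Int {
    var bestLag = 0
    var bestScore = -Float.greatestFiniteMagnitude
    for lag in 0..<maxLag {
        var score: Float = 0
        var i = 0
        while i < played.count && i + lag < recorded.count {
            score += played[i] * recorded[i + lag]
            i += 1
        }
        if score > bestScore {
            bestScore = score
            bestLag = lag
        }
    }
    return bestLag
}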
After much research (aka googling) I found a smattering of discussions online
about calculating latency in CoreAudio, but they generally pertain to macOS
rather than iOS and refer to functions that do not exist on iOS. It seems that
on iOS the correct approach is to use AVAudioSession and some combination of
inputLatency, outputLatency, and ioBufferDuration. However, no combination of
these values adds up to the empirically determined values above. In addition,
I get wildly different values for each property when I check them before vs.
after calling AudioUnitInitialize. Even more confusing, the values are much
closer to the expected latency before the call to AudioUnitInitialize, which
is the opposite of what I would expect.
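For reference, the "total samples" figures below are just
(inputLatency + outputLatency + ioBufferDuration) * sampleRate. A minimal
sketch of that query, assuming a play-and-record session running at the
44.1 kHz hardware rate the totals are based on:

import AVFoundation

// Sketch: sum the three AVAudioSession latency components and convert the
// result to samples at the session's current sample rate.
func totalLatencyInSamples() -> Double {
    let session = AVAudioSession.sharedInstance()
    let seconds = session.inputLatency
        + session.outputLatency
        + session.ioBufferDuration
    // e.g. (0.032375 + 0.013651 + 0.023220) * 44100 ≈ 3054 on the iPad (before)
    return seconds * session.sampleRate
}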
Here are the values I am seeing.
iPad (before): in 0.032375, out 0.013651, buf 0.023220, total samples 3054
iPad (after): in 0.000136, out 0.001633, buf 0.023220, total samples 1102
iPhone (before): in 0.065125, out 0.004500, buf 0.021333, total samples 4011
iPhone (after): in 0.000354, out 0.000292, buf 0.021333, total samples 969
The simulator always returns 0.01 for in and out, but I suspect these aren't
actual/correct values and that the simulator just doesn't support this
functionality.
One other potentially interesting note is that I'm using
kAudioUnitSubType_VoiceProcessingIO rather than kAudioUnitSubType_RemoteIO,
which I do expect to add some additional latency. My assumption is that this
would be included in the inputLatency value, but perhaps there's another value
I need to query to capture it?
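One candidate might be kAudioUnitProperty_Latency on the I/O unit itself; a
minimal sketch of that query is below (ioUnit is the already-created
voice-processing unit, and I have not confirmed whether this actually reports
the voice-processing delay):

import AudioToolbox

// Sketch: ask the audio unit for its own reported latency (in seconds),
// in case the voice-processing DSP delay shows up here rather than in
// AVAudioSession.inputLatency.
func unitLatencySeconds(_ ioUnit: AudioUnit) -> Float64 {
    var latency: Float64 = 0
    var size = UInt32(MemoryLayout<Float64>.size)
    let status = AudioUnitGetProperty(ioUnit,
                                      kAudioUnitProperty_Latency,
                                      kAudioUnitScope_Global,
                                      0,
                                      &latency,
                                      &size)
    return status == noErr ? latency : 0
}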
What's the correct way to determine the total latency between input and output
in iOS?