Measuring hardware throughput of a physically looped pulse
- Subject: Measuring hardware throughput of a physically looped pulse
- From: Heinrich Fink <email@hidden>
- Date: Tue, 20 Dec 2011 11:49:45 +0100
Dear list readers,
In order to better understand timing with the AUHal and the timestamps it provides, I have written a test application that measures the difference between the input and output host times of a generated pulse signal that is physically looped back. I think I am getting reasonable results, but I would greatly appreciate a second opinion here.
The signal path looks like the following:
1.] The AUHal instance configured with the output device asks for samples.
2.] A pulse is generated (every 2 seconds). The sample time of the first pulse sample is translated into host time using AudioDeviceTranslateTime for the output device.
3.] The output signal is physically looped back to the input of another audio device (could be the same as well).
4.] The AUHal configured with the input device provides samples via the input callback.
5.] When the first sample of a pulse is detected in the input callback, its input sample time is translated into host time using AudioDeviceTranslateTime on the input device.
6.] The difference between the host times retrieved in 2.] and 5.] is calculated, converted into ms and printed out.
The host times calculated in 2.] and 5.] should be as close as possible to the actual analog voltage output/input times. Therefore, the diff in 6.] should be as close to zero as possible.
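In case it is useful for the discussion, this is roughly what my test application does at 2.], 5.] and 6.], condensed into two helpers (a simplified sketch: the device ID and the sample times are placeholders supplied by the callbacks, and most error handling is stripped):

#include <CoreAudio/CoreAudio.h>   // AudioDeviceTranslateTime, AudioTimeStamp
#include <CoreAudio/HostTime.h>    // AudioConvertHostTimeToNanos

// Translate a HAL sample time on the given device into a host time.
// Returns 0 if the translation fails.
static UInt64 SampleTimeToHostTime(AudioDeviceID device, Float64 sampleTime)
{
    AudioTimeStamp inTime  = { 0 };
    AudioTimeStamp outTime = { 0 };

    inTime.mFlags      = kAudioTimeStampSampleTimeValid;  // what we provide
    inTime.mSampleTime = sampleTime;
    outTime.mFlags     = kAudioTimeStampHostTimeValid;    // what we want back

    OSStatus err = AudioDeviceTranslateTime(device, &inTime, &outTime);
    return (err == noErr) ? outTime.mHostTime : 0;
}

// Step 6.]: difference between the two translated host times, in milliseconds.
// Assumes the input-side host time is the later of the two.
static Float64 HostTimeDiffMillis(UInt64 inputHostTime, UInt64 outputHostTime)
{
    UInt64 diffNanos = AudioConvertHostTimeToNanos(inputHostTime - outputHostTime);
    return (Float64)diffNanos / 1.0e6;
}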
I am seeing a reported difference of 0.36 ms to 0.44 ms on an Early 2011 MacBook Pro running OS X 10.7.2, using "Built-In Output" as the output device and "Built-In Input" as the input device. I am not completely sure how close the measured timestamps could possibly get, i.e. which latencies are outside of CoreAudio’s control and therefore remain in the difference.
I know that latency in general, and the use of CoreAudio’s clock, has been discussed on this list several times; I have studied those threads during the past few days. I have therefore considered the following in my test application (hopefully without drawing any wrong conclusions from my research):
The AUHal for the output device has kAudioOutputUnitProperty_StartTimestampsAtZero disabled, so that the sample times it delivers to the callbacks are the device’s own HAL sample times and AudioDeviceTranslateTime can work properly.
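In code this boils down to a single property call on the AUHal instance (trimmed down; "outputAU" stands for the already-created output unit):

#include <AudioUnit/AudioUnit.h>

// Turn off zero-based timestamps so that the timestamps seen in the callbacks
// carry the device's own HAL sample times.
static OSStatus DisableZeroBasedTimestamps(AudioUnit outputAU)
{
    UInt32 zero = 0;
    return AudioUnitSetProperty(outputAU,
                                kAudioOutputUnitProperty_StartTimestampsAtZero,
                                kAudioUnitScope_Global,
                                0,              // element
                                &zero,
                                sizeof(zero));
}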
The sample time passed to AudioDeviceTranslateTime in 2.] is actually
P_out_s + L_out_hw + L_out_s
P_out_s = HAL sample time of the pulse’s first sample that will be set to 1.0
L_out_hw = kAudioDevicePropertyLatency of the output device
L_out_s = output stream latency, if any
The sample time passed to AudioDeviceTranslateTime in 5.] is actually
P_in_s - L_in_hw - L_in_s
P_in_s = HAL sample time of the first detected sample that exceeds a certain threshold
L_in_hw = kAudioDevicePropertyLatency of the input device
L_in_s = input stream latency, if any
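For completeness, this is roughly how I query the latency terms above; the same helper serves both sides since it takes the scope as a parameter (simplified: only the device's first stream is considered, and errors simply fall back to 0):

#include <CoreAudio/CoreAudio.h>

// L_hw + L_s for one device and scope (kAudioObjectPropertyScopeOutput or
// kAudioObjectPropertyScopeInput): kAudioDevicePropertyLatency of the device
// plus kAudioStreamPropertyLatency of its first stream in that scope.
// The result is added to P_out_s on the output side and subtracted from
// P_in_s on the input side before calling AudioDeviceTranslateTime.
static UInt32 TotalLatencyFrames(AudioDeviceID device, AudioObjectPropertyScope scope)
{
    UInt32 deviceLatency = 0;
    UInt32 streamLatency = 0;
    UInt32 size = sizeof(UInt32);

    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyLatency, scope, kAudioObjectPropertyElementMaster
    };
    AudioObjectGetPropertyData(device, &addr, 0, NULL, &size, &deviceLatency);

    // First stream of the device in this scope, then that stream's latency.
    AudioStreamID stream = kAudioObjectUnknown;
    size = sizeof(stream);
    addr.mSelector = kAudioDevicePropertyStreams;
    if (AudioObjectGetPropertyData(device, &addr, 0, NULL, &size, &stream) == noErr &&
        stream != kAudioObjectUnknown)
    {
        AudioObjectPropertyAddress streamAddr = {
            kAudioStreamPropertyLatency,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        size = sizeof(streamLatency);
        AudioObjectGetPropertyData(stream, &streamAddr, 0, NULL, &size, &streamLatency);
    }

    return deviceLatency + streamLatency;
}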
Again, any second opinions / corrections are greatly appreciated! The way timing works with CoreAudio is both fascinating and hard to learn properly (as many of you probably already know…).
best regards,
Heinrich Fink