Re: Converting AudioTimeStamp between two devices
- Subject: Re: Converting AudioTimeStamp between two devices
- From: Jeff Moore <email@hidden>
- Date: Thu, 1 Jun 2006 20:15:59 -0700
On Jun 1, 2006, at 6:00 PM, Andrew Kimpton wrote:
> I need to make sure that my captured audio is in perfect sync with
> the played/output audio. In order to correct for system delays I
> use a comparison between the 'inInputTime' timestamp and the
> 'inOutputTime' timestamp to establish when the output audio could
> be heard at the input port.
>
> This works well with a single callback on a single device (and for
> aggregate devices); if I loop output to input, my recording is in sync.
Yup. On a single device, you only have one clock to deal with.
> However when the two devices are different, subtracting the sample
> times from the two devices would not be valid (correct?)
Nope. In this case, you have two clocks to deal with. Each clock can
have its own idea of where 0 was, and they are running at different
rates, so the relationship between the clocks is more complicated.
As you note, an aggregate device handles relating the clocks of the
two devices and deals with the synchronization issues so you don't
have to.
> So I'm attempting to convert the output timestamp to an input
> timestamp using AudioDeviceTranslateTime() as follows:
>
>     AudioTimeStamp firstPlayTime = iFirstPlayTimeStamp;
>     AudioTimeStamp firstPlayTimeForInputDevice;
>
>     firstPlayTime.mFlags = kAudioTimeStampSampleHostTimeValid;
>     memset(&firstPlayTimeForInputDevice, 0, sizeof(firstPlayTimeForInputDevice));
>     firstPlayTimeForInputDevice.mFlags = kAudioTimeStampSampleTimeValid;
>
>     OSStatus osStatus = AudioDeviceTranslateTime(iInputDeviceID, &firstPlayTime, &firstPlayTimeForInputDevice);
>
> This 'works' in that I get no error and my
> firstPlayTimeForInputDevice is updated; however, the actual value
> doesn't seem to be converted: it is the same as
> firstPlayTime.mSampleTime.
>
> That doesn't seem right to me. I would expect the values to be
> different between the two devices; furthermore, actually testing this
> with a loopback between output and input doesn't give correct
> results (the recorded input is out of sync).
>
> Is this the correct way to establish what the sample time on one
> device would be on a second device? Or is there some alternative
> way? Should I be specifying more flags? iFirstPlayTimeStamp is a
> straight copy of the HAL callback time stamp from the play/output
> callback, so it does have a full complement of sample time, host
> time and rate scalar info.
The problem with your code is that you are providing too much
information in the first time stamp. You are saying, in effect,
"here's a time stamp with both a valid sample time and a valid host
time, please give me the sample time". The HAL is happy to oblige.
What you really want to say is "here's a time stamp with a valid host
time, please translate that to a sample time." To do this, you just
need to set kAudioTimeStampHostTimeValid in the flags field of
firstPlayTime.
BTW, simply translating the time stamps is not going to be enough to
keep things in synch. That just allows you to get things to start out
in synch. Since the two devices are running at different rates, you
will also need to deal with the clock drift over time.
--
Jeff Moore
Core Audio
Apple
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden