Re: Audio/Video synchronization
- Subject: Re: Audio/Video synchronization
- From: Rahul <email@hidden>
- Date: Mon, 12 Jun 2006 21:20:07 +0530
Hi Jeff,
Thank you for your inputs.
I have tried the following method to achieve synchronization.
The app receives a stream of video and audio packets. Each of them has a
timestamp on it.
I have a thread whose sole responsibility is to take the input timestamp
from the render callback and set the master clock depending on this.
In the thread I do the following:
1. Read the "inTimeStamp" value from the buffer shared with the render
callback.
2. Use AudioDeviceTranslateTime to convert this "inTimeStamp" value to the
device sample time (a sketch of this call follows the loop below). I presume
this is a value in the future, right? Let us assume that the render callback
gave a device sample time of 2126.00. Now I assume that until the audio device
sample time reaches 2126.00, 2048 bytes (depending on the output stream, this
might vary) will be played.
3. I also know how long the audio packet will play; this is a multiple of
64 ms. I also have its size in bytes. I now call AudioDeviceGetCurrentTime in
a loop. On each iteration I take the delta between two successive calls to
AudioDeviceGetCurrentTime and multiply it by 4; this gives the bytes played.
When the number of bytes played reaches the size of the audio packet, the
master clock is set to the next audio packet's time stamp.
// the loop looks roughly like this
do
{
    // ...
    AudioDeviceGetCurrentTime(deviceId, &the_DeviceCurTime);
    deltaTime = the_DeviceCurTime.mSampleTime - oldTime;

    // bytes played in this interval is (deltaTime * 4), since the
    // input stream has 4 bytes per sample frame
    myPacketSize -= (deltaTime * 4);

    if (myPacketSize <= 0)
    {
        // finished playing the packet: set the master clock to the
        // next audio packet's time stamp
    }
    // ...
    oldTime = the_DeviceCurTime.mSampleTime;
} while (the_DeviceCurTime.mSampleTime < deviceFutureTime);

// after this I read the next value sent by the render callback
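For completeness, here is a minimal sketch of the translation in step 2. The
names "deviceId" and "cbTimeStamp" are illustrative (the output AudioDeviceID
and a saved copy of the render callback's inTimeStamp, which must carry at
least one valid representation such as kAudioTimeStampHostTimeValid), not the
app's actual variables:

Float64 deviceFutureTime = 0;
AudioTimeStamp deviceTime = { 0 };
deviceTime.mFlags = kAudioTimeStampSampleTimeValid;   // request device sample time
if (AudioDeviceTranslateTime(deviceId, &cbTimeStamp, &deviceTime) == noErr)
{
    // the device sample time at which the rendered buffer will actually play,
    // e.g. 2126.00 in the example above; used as deviceFutureTime in the loop
    deviceFutureTime = deviceTime.mSampleTime;
}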
Now to the problem I face: the video runs ahead of the audio. The reason is
that the master clock is set earlier than it should be, because for some
reason the "deltaTime" values add up to the audio packet size faster than they
should.
Basically, what I want to know is how many of the bytes I have supplied to the
device have actually been played.
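(Purely as an illustration of what I am asking for, and not what the app
currently does: one could imagine measuring progress against a fixed anchor
sample time instead of summing per-iteration deltas. The names
"packetStartSampleTime" and "bytesPerFrame" below are hypothetical.)

AudioTimeStamp now;
AudioDeviceGetCurrentTime(deviceId, &now);
// frames the device has consumed since this packet's anchor time
Float64 framesPlayed = now.mSampleTime - packetStartSampleTime;
Float64 bytesPlayed  = framesPlayed * bytesPerFrame;   // bytesPerFrame == 4 here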
Can you please give your inputs on the current method?
Thank you.
Regards,
Rahul.
On 6/8/06 12:31 AM, "Jeff Moore" <email@hidden> wrote:
>
> On Jun 7, 2006, at 5:15 AM, Rahul wrote:
>
>> The master clock is set with the time stamp on the audio packet. This is
>> done through regular notifications from the input procedure that supplies
>> data to the Audio Converter (ACComplexInputProc in the PlayAudioFileLite
>> example). We get the time difference (in AbsoluteTime) between two calls
>> to this input procedure. We consider this time difference as the duration
>> of the sample already played. When this difference adds up to the duration
>> of the input audio packet, we set the master clock to the new audio packet
>> timestamp.
>
> This calculation has error in it. The current time when your input
> proc is called is not the time at which the input data was acquired.
> Thus, the difference between that and the succeeding call is only a
> very rough approximation of the duration of the packet. At the very
> least, it contains an enormous amount of jitter due to scheduling
> latency in the IO thread and any variances in timing in the code path
> that leads to your input proc getting called.
>
>> But the same logic,
>> 1. for a default output device with a 44100 sample rate, finds the video
>>    running behind the audio
>> 2. for a default output device with a 32000 sample rate, finds the audio
>>    running behind the video
>>
>> We have also observed that CoreAudio plays an audio packet for a time more
>> than its calculated duration. It looks like it is extrapolating the packet.
>>
>> Any inputs on this? Or is there any other method in CoreAudio (truly
>> indicating the playing status) which we could use to update our master
>> clock?
>
> Basically, you have a case of garbage in/garbage out. The way you are
> calculating the time stamps is introducing some error into your
> calculation.
>
> There are any number of alternatives. At the HAL level, the IOProc is
> handed the time stamp for the data directly (I'm not exactly sure how
> this time stamp percolates through AUHAL). The HAL also provides
> AudioDeviceGetCurrentTime() and AudioDeviceTranslateTime() to aid in
> tracking the audio device's time base.
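
For reference, a minimal sketch of the HAL-level IOProc Jeff describes,
assuming a proc registered with AudioDeviceAddIOProc and started with
AudioDeviceStart (the name MyIOProc is illustrative):

static OSStatus MyIOProc(AudioDeviceID          inDevice,
                         const AudioTimeStamp  *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp  *inInputTime,
                         AudioBufferList       *outOutputData,
                         const AudioTimeStamp  *inOutputTime,
                         void                  *inClientData)
{
    // inOutputTime is the device time at which outOutputData will actually be
    // played, so it can anchor a master clock directly, without any delta
    // accumulation; inNow is the device's current time.
    Float64 whenThisBufferPlays = inOutputTime->mSampleTime;
    (void)whenThisBufferPlays;
    // ... fill outOutputData with the next audio packet here ...
    return noErr;
}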
-----------------------------------------------
Robosoft Technologies - Come home to Technology