Re: Audio/Video synchronization
- Subject: Re: Audio/Video synchronization
- From: Rahul <email@hidden>
- Date: Tue, 13 Jun 2006 21:13:46 +0530
Hello Jeff,
Please find below the information about the time stamps:
>>The app receives a stream of video and audio packets. Each of them has a
>>timestamp on it.
>You probably ought to go into what this means. It sounds like the media you
>want to play has its own embedded clock in it, like MPEG does.
The time stamps that I receive are in units of 100 nanoseconds. They look
something like this:
AudioPacket 1: TimeStamp: 67210000
AudioPacket 2: TimeStamp: 70410000
So the duration of the first packet is 320 msec.
There is a start time stamp, and the value of each ensuing time stamp depends
on the duration of the audio packet, which is usually a multiple of 64 msec.
So the time stamp of the next packet will be (current time stamp + duration).
>>I have a thread whose sole responsibility is to take the input timestamp
>>from the render callback and set the master clock depending on this.
>What is this "master clock"? What is it tracking? What units is it using?
The master clock is set to the time stamp of the audio packet. Each time a
video packet queries this clock, it returns its value + (the time elapsed in
nanoseconds since it was set).
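In code, it is roughly the following (a sketch only; I am using the HostTime utilities from <CoreAudio/HostTime.h>, and the struct/function names are mine):

#include <stdint.h>
#include <CoreAudio/HostTime.h>

/* Master clock: the 100 ns time stamp of the audio packet currently playing,
   plus the host time at which the clock was last set. */
typedef struct {
    uint64_t mediaTimeStamp;   /* audio packet time stamp, 100 ns units */
    uint64_t hostTimeAtSet;    /* host time when the clock was set */
} MasterClock;

static void MasterClockSet(MasterClock *clock, uint64_t packetTimeStamp)
{
    clock->mediaTimeStamp = packetTimeStamp;
    clock->hostTimeAtSet  = AudioGetCurrentHostTime();
}

/* Clock value in 100 ns units: packet time stamp + time elapsed since set. */
static uint64_t MasterClockGet(const MasterClock *clock)
{
    uint64_t elapsed = AudioGetCurrentHostTime() - clock->hostTimeAtSet;
    return clock->mediaTimeStamp + AudioConvertHostTimeToNanos(elapsed) / 100;
}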
>>1. Read the "inTimeStamp" value from the buffer shared with the render
>>callback
>I presume you are referring to the input proc of an instance of AUHAL or are
>you directly using the HAL now? I'm a tad confused.
Sorry for the confusion. I was referring to the render callback of the
default audio output unit, exactly like the "fileRenderProc" in the
PlayAudioFileLite example.
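Trimmed down, the callback looks something like this (a sketch, not the exact code; sharing the value through a bare global is only for illustration):

#include <AudioUnit/AudioUnit.h>

/* Last device sample time handed to the render callback; the clock
   thread reads this (in the real code the access is synchronized). */
static volatile Float64 gLastRenderSampleTime = 0.0;

static OSStatus MyRenderProc(void                       *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp       *inTimeStamp,
                             UInt32                      inBusNumber,
                             UInt32                      inNumberFrames,
                             AudioBufferList            *ioData)
{
    if (inTimeStamp->mFlags & kAudioTimeStampSampleTimeValid)
        gLastRenderSampleTime = inTimeStamp->mSampleTime;

    /* ... fill ioData with inNumberFrames frames of decoded audio ... */
    return noErr;
}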
>>2. Use AudioDeviceTranslateTime and convert this "inTimeStamp" value to the device
>>sample time. I presume this value is in the future, right?
>Until I know where this "inTimeStamp" is coming from, it's hard to say. At any
>rate, the time stamps the HAL provides for input data are actually in the past.
>But if you mean the time stamp for the render callback of AUHAL, then I believe
>that this is a sample time in the future.
Yes, it is the latter case. I am getting the time stamp from the render
callback of AUHAL.
>But, depending on the circumstances, the sample time you get from AUHAL does
>not have the same zero point as what the HAL uses for translating time with
>AudioDeviceTranslateTime(). Where the HAL derives its time stamps directly from
>the hardware, AUHAL supplies time stamps that are (I think, hopefully Doug will
>correct me if I'm wrong about this) zeroed when AUHAL starts playing.
>The net effect is that you have to use
>kAudioOutputUnitProperty_StartTimestampsAtZero to turn off AUHAL's remapping of
>the time stamps to get values you can pass to AudioDeviceTranslateTime().
I had set this property to false.
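For completeness, this is roughly how I set it (outputUnit being my output AudioUnit instance):

#include <AudioUnit/AudioUnit.h>

/* Turn off AUHAL's remapping so the render time stamps stay on the
   device's own sample timeline. */
static OSStatus DisableStartTimestampsAtZero(AudioUnit outputUnit)
{
    UInt32 startAtZero = 0;   /* false */
    return AudioUnitSetProperty(outputUnit,
                                kAudioOutputUnitProperty_StartTimestampsAtZero,
                                kAudioUnitScope_Global,
                                0,
                                &startAtZero,
                                sizeof(startAtZero));
}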
>>Let us assume
>>that the render callback gave a device sample time of 2126.00. Now I assume
>>that until the audio device sample time reaches 2126.00, 2048 bytes (depending
>>on the output stream, this might vary) will be played.
>I'm not following this statement too well. You can't measure a duration without
>two points on the timeline. You mention one, 2126, but not the other that you
>are measuring against.
Actually I am not trying to measure the duration. As I mentioned, all I need
is the status of the current sample being played. So what I could know is that
when the audio device sample time equals 2126.00, it has played 2048 bytes.
Is this right?
From this and AudioDeviceGetCurrentTime() I was trying to work out how much of
my audio packet has been played.
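Stated as code, the computation I have in mind is the sketch below (assuming 4 bytes per frame, i.e. 16-bit stereo; it measures directly against the render callback's sample time rather than accumulating the loop deltas, but the arithmetic is the same):

#include <CoreAudio/AudioHardware.h>

enum { kBytesPerFrame = 4 };   /* 16-bit stereo */

/* Bytes played since the render callback's sample time, judged from the
   device's current sample time. */
static OSStatus BytesPlayedSince(AudioDeviceID device,
                                 Float64       renderSampleTime,
                                 UInt64       *outBytesPlayed)
{
    AudioTimeStamp now = { 0 };
    OSStatus err = AudioDeviceGetCurrentTime(device, &now);
    if (err != noErr)
        return err;

    Float64 framesPlayed = now.mSampleTime - renderSampleTime;
    if (framesPlayed < 0.0)
        framesPlayed = 0.0;   /* that sample time is still in the future */

    *outBytesPlayed = (UInt64)(framesPlayed * kBytesPerFrame);
    return noErr;
}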
>>3. I also have the information for how long the audio packet will play. This
>>is a multiple of 64 ms.
>I presume you mean how long its nominal duration is. Let's assume that the
>data is at a 48K sample rate, which makes the 64ms nominally be 3072 samples.
>Unfortunately, it is very rare to get an audio device that plays at its true
>nominal rate. The device's true rate can vary by a great deal. As such, you
>cannot know, a priori, the amount of time it will take a given device to play
>those 3072 samples.
So what would be the alternative here?
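Would taking the time stamp's mRateScalar into account be the right direction? If I understand it correctly, it is the ratio of the device's actual host ticks per frame to the nominal host ticks per frame, so something like this sketch (possibly wrong, names are mine):

#include <CoreAudio/CoreAudioTypes.h>

/* Nominal frame count for a packet, e.g. 64 ms at 48 kHz = 3072 frames. */
static Float64 NominalFrames(Float64 durationMsec, Float64 nominalSampleRate)
{
    return durationMsec * nominalSampleRate / 1000.0;
}

/* Estimated wall-clock seconds needed to play 'frames' frames, using the
   mRateScalar from a time stamp where kAudioTimeStampRateScalarValid is set. */
static Float64 EstimatedDurationSecs(Float64 frames,
                                     Float64 nominalSampleRate,
                                     const AudioTimeStamp *deviceNow)
{
    return (frames * deviceNow->mRateScalar) / nominalSampleRate;
}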
>>However I also have its size in bytes. I now use
>>AudioDeviceGetCurrentTime in a loop. Each time in the loop, I get the
>>delta between the two calls to AudioDeviceGetCurrentTime and multiply this
>>by 4. This is the bytes played. When the number of bytes that I have
>>played becomes equal to the size of the audio packet, the master clock is
>>set to the next audio packet's time stamp.
>That will work, I suppose, but it isn't particularly efficient. If you want to
>know when a particular sample is going to be reached by the hardware, it's
>better to just use AudioDeviceTranslateTime() and ask for it directly.
Could you please elaborate more on this?
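My current reading of this is something like the sketch below (translating a device sample time into the host time at which it will be played); please correct me if this is not what you mean:

#include <CoreAudio/AudioHardware.h>

/* Ask the HAL at what host time a given device sample time will occur. */
static OSStatus HostTimeForSampleTime(AudioDeviceID device,
                                      Float64       sampleTime,
                                      UInt64       *outHostTime)
{
    AudioTimeStamp in  = { 0 };
    AudioTimeStamp out = { 0 };

    in.mSampleTime = sampleTime;
    in.mFlags      = kAudioTimeStampSampleTimeValid;
    out.mFlags     = kAudioTimeStampHostTimeValid;   /* what we want back */

    OSStatus err = AudioDeviceTranslateTime(device, &in, &out);
    if (err == noErr)
        *outHostTime = out.mHostTime;
    return err;
}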
>>Now to the problem that I face. The video is running ahead of the audio.
>>The reason is that the master clock is set earlier than required. This
>>happens because for some reason the "deltaTime" values add up to the audio
>>packet size faster than they should. This is causing the problem.
>It sounds to me like you aren't accounting for the true rate of the audio
>hardware, like I mentioned above. It's also possible that you still have a
>Garbage In/Garbage Out problem due to the discrepancy between how AUHAL tracks
>sample time and how the HAL tracks it.
>>Basically what I want to know is how far I have played into the bytes I have
>>supplied to the device.
>There is no buffering going on here. There is no added latency in the software.
>You can know precisely when a given sample is going to hit the wire using the
>HAL's time stamps and the latency figures provided. Data is consumed at a rate
>that is expressed through the time stamps.
>>Can you please give your inputs on the current method?
>It seems to me that you are having a great deal of difficulty correlating the
>time stamps the system is giving you to your position in the media.
Exactly. This is the problem that I have.
>I suspect that is because you have not been formal enough in how you handle
>things. You need to be very formal in how you relate the presentation time
>stamps in the media to be played to the CPU clock.
Any suggestions or alternative method that you would prescribe?
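If it helps to make the question concrete, is the kind of formal mapping you describe something like the sketch below (hypothetical names): anchor each audio packet's 100 ns presentation time stamp to the host time at which its first sample will play (obtained via AudioDeviceTranslateTime() as above), and have the video side read the clock off that anchor instead of my polling loop?

#include <stdint.h>
#include <CoreAudio/HostTime.h>

/* Anchor relating media time to the CPU clock: the packet whose presentation
   time stamp is 'mediaTimeStamp' starts playing at 'hostTime'. */
typedef struct {
    uint64_t mediaTimeStamp;   /* packet presentation time, 100 ns units */
    uint64_t hostTime;         /* host time when its first sample hits the wire */
} SyncAnchor;

/* Media position in 100 ns units as seen at 'nowHostTime'
   (assumes nowHostTime >= anchor->hostTime). */
static uint64_t MediaTimeAtHostTime(const SyncAnchor *anchor, uint64_t nowHostTime)
{
    uint64_t elapsed = nowHostTime - anchor->hostTime;
    return anchor->mediaTimeStamp + AudioConvertHostTimeToNanos(elapsed) / 100;
}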
Regards,
Rahul.
-----------------------------------------------
Robosoft Technologies - Come home to Technology