Re: Audio/Video synchronization
- Subject: Re: Audio/Video synchronization
- From: Rahul <email@hidden>
- Date: Thu, 15 Jun 2006 21:06:20 +0530
Hello Jeff,
Thank you for the detailed reply.
>Sorry this took so long. At any rate, the back-and-forth of this thread has
gotten a bit long and hard to follow. So here's my summary of my understanding
of your problem (numbered for easy reference):
>1) You are trying to play media containing audio and video in synch, but so far
haven't had too much success
>2) The media has an embedded clock in it, similar to MPEG (the actual rate of
that clock isn't really that important)
>3) The audio packets in the media have a start time that is calculated from an
anchor point plus the sum of the durations of the preceding packets, implying
that the audio data is to be played contiguously.
>4) Your code has a "master" clock that wants to track the number of samples
played. The actual values that this clock provides to its callers are in
nanoseconds.
>5) Your code uses AUHAL as its interface to the audio hardware and has set
things up so that the time stamps in the render callbacks are the raw time
stamps from the HAL.
>6) When you look at why things aren't in synch with these methods, it appears
that the audio data is playing back too fast.
>I think that covers it.
Precisely. It pretty much summarizes my problem.
>I'm concerned about your master clock. How are you converting from the samples
you count into nanoseconds?
> I think this is the crux of your problem. This is what I was talking about
when I mentioned accounting for the true playback rate of the audio hardware.
>From the sound of it, the audio hardware is running a tad slow (not an uncommon
thing) and you aren't adjusting this calculation accordingly.
<mail clipped>
>One other thing to note is that it doesn't sound like you are doing anything
with the presentation latencies of the two devices involved. Audio and video
devices tend to be a lot different in this regard. To properly synch the media
right from the start, you need to account for the differences when you schedule
the audio and video to be played. The HAL provides the device's latency via the
property, kAudioDevicePropertyLatency.
I think I have made some progress with your inputs. Hopefully I will get it
done soon. The current strategy is something like this:
1. I am holding the output unit to its promise. Let's say it gives a time
stamp saying it will play 2048 bytes when the device sample time reaches
2126.0.
2. So now I wait until the sample time of the device reaches 2126.0 from
its current value. When it reaches 2126.0, I can be sure that it has kept its
promise and played exactly 2048 bytes.
3. I now map this value onto the samples of the input packets that I have
given it, find the packet it is currently playing, and set the master clock
to that packet's time stamp.
There is, however, one more thing that is bothering me. The data that I
supply to the output audio unit comes from a buffer. Now there is a
requirement wherein I have to stop and discard everything that the audio
unit is currently playing and is about to play. For this, I have flushed the
source buffer and tried using AudioOutputUnitStop followed by AudioUnitReset
to ask the audio unit to do the same.
But after starting the output unit again (after about 3 seconds), I find
that it has not discarded the samples which it was supposed to play before
the stop. These samples are now played after I call start.
Is there a way I can tell the audio unit to discard everything that it is
currently holding?
Thanks again.
Regards,
Rahul.
-----------------------------------------------
Robosoft Technologies - Come home to Technology
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)