
Re: Crude video synchronization


  • Subject: Re: Crude video synchronization
  • From: Ross Bencina <email@hidden>
  • Date: Sun, 22 May 2011 10:37:16 +1000

Alex Weiss wrote:
You will want to convert sample counts to a wall clock time and then
use another mapping to convert wall clock time to a frame offset based
on whatever the video frame rate is.

CoreAudio can tell you exactly when (in wall clock time) the output samples will reach the analog output without you needing to perform the conversion yourself.


I'm not exactly sure it works the same way with RemoteIO, but with AUHal you get AudioTimeStamp::mHostTime in your IOProc, and then you can offset this time by the known latency to get the time the first sample of the output buffer will hit the DAC.

With AUHal the offset can be retrieved via AudioDeviceGetProperty using kAudioDevicePropertyLatency. This latency is reported in samples (frames), so you need to divide it by the sample rate to get a time offset in seconds. If you want it to be really accurate you would divide by the actual (not nominal) sample rate -- the HAL exposes kAudioDevicePropertyActualSampleRate for this purpose, but I don't think RemoteIO does.
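As a rough sketch of that query (using the current AudioObjectGetPropertyData spelling rather than the older AudioDeviceGetProperty; deviceID is assumed here to be the AUHal's output device, and error checking is omitted):

    #include <CoreAudio/CoreAudio.h>

    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyLatency,
        kAudioObjectPropertyScopeOutput,
        kAudioObjectPropertyElementMaster
    };
    UInt32 latencyFrames = 0;
    UInt32 size = sizeof(latencyFrames);
    AudioObjectGetPropertyData(deviceID, &addr, 0, NULL, &size, &latencyFrames);

    /* the measured rather than nominal rate, if you want the extra accuracy */
    Float64 actualRate = 0.0;
    addr.mSelector = kAudioDevicePropertyActualSampleRate;
    size = sizeof(actualRate);
    AudioObjectGetPropertyData(deviceID, &addr, 0, NULL, &size, &actualRate);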

so you have, roughly:
outputTime = AudioTimeStamp::mHostTime + AudioConvertNanosToHostTime(1e9 * deviceLatency / sampleRate);

(i.e. convert the latency from samples to seconds, then to host-time ticks, and add it to the timestamp).

Once you have this conversion to wall time you can convert it to frame-time and from there trigger your video playback (somehow). Perhaps you run a separate animation timer that compares AudioGetCurrentHostTime() against when the next frame should play. Lots of options I think.
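Something like this inside the IOProc, as a sketch only (latencyFrames and sampleRate are queried as above; playbackStartHostTime and fps are names made up here for the host time recorded when playback started and the video frame rate):

    #include <CoreAudio/CoreAudio.h>
    #include <CoreAudio/HostTime.h>

    /* when will the first sample of this buffer reach the DAC? */
    Float64 latencySeconds = (Float64)latencyFrames / sampleRate;
    UInt64  outputHostTime = inTimeStamp->mHostTime +
                             AudioConvertNanosToHostTime((UInt64)(latencySeconds * 1.0e9));

    /* map that onto a video frame index */
    Float64 elapsedSeconds = 1.0e-9 *
        AudioConvertHostTimeToNanos(outputHostTime - playbackStartHostTime);
    SInt64  videoFrame     = (SInt64)(elapsedSeconds * fps);

An animation timer on another thread can then compare AudioGetCurrentHostTime() against the host time at which videoFrame + 1 is due.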

I thought about this problem a while ago with respect to the PortAudio timing API. I don't think the methods there are state of the art, but the analysis is probably relevant if you're just starting to think about this:
http://www.portaudio.com/docs/portaudio_sync_acmc2003.pdf


HTH

Ross.


If you want to examine a working iOS example that implements the time-to-video-frame logic, have a look at the PNG Animator example here:

http://www.modejong.com/iOS/#ex2

cheers
Mo

On Sat, May 21, 2011 at 3:03 AM, Alex Weiss <email@hidden> wrote:
Hello,
I'm trying to implement some crude audio-video synchronization in my
application. I say crude because it doesn't need to be microsecond-accurate
-- the picture won't be married to the audio, it'll just play alongside a
software synth so that the user can perform in sync to the picture.
Unfortunately, I have little to no experience in AV synching, so I'm not
sure my approach would work. Basically, I'm thinking I could just group
samples together into video frames (e.g. at 48kHz and 25fps, there would be
1920 samples of audio in a frame of video) and count them during playback;
whenever the next block starts, I'd simply display the next frame.
However, something tells me this might be __too__ crude an implementation.
Additionally, things might get hairy with fractional framerates, such as
29.97 and 23.976 -- I'm worried that rounding errors might accumulate.
Any ideas?
Thanks,
Alex
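A minimal sketch of the counting Alex describes: if the frame index is recomputed from the running sample count each time (rather than incremented once per 1920-sample block), rounding error cannot accumulate, even at fractional rates like 29.97 or 23.976 fps. The names here are illustrative:

    #include <stdint.h>

    /* totalSamplesPlayed: running count maintained by the render callback
       sampleRate: e.g. 48000.0
       fps: e.g. 30000.0 / 1001.0 for 29.97 */
    static int64_t FrameForSampleCount(int64_t totalSamplesPlayed,
                                       double sampleRate, double fps)
    {
        /* recomputed from scratch each call, so no drift */
        return (int64_t)((double)totalSamplesPlayed * fps / sampleRate);
    }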
References:
  • Crude video synchronization (From: Alex Weiss <email@hidden>)
  • Re: Crude video synchronization (From: Mo DeJong <email@hidden>)
