Re: Crude video synchronization
- Subject: Re: Crude video synchronization
- From: Mo DeJong <email@hidden>
- Date: Sat, 21 May 2011 12:15:06 -0700
Alex,
You will want to convert sample counts to a wall clock time and then
use another mapping to convert wall clock time to a frame offset based
on whatever the video frame rate is. If you want to examine a
working iOS example that implements the time-to-video-frame logic,
have a look at the PNG Animator example here:
http://www.modejong.com/iOS/#ex2
cheers
Mo
On Sat, May 21, 2011 at 3:03 AM, Alex Weiss <email@hidden> wrote:
> Hello,
> I'm trying to implement some crude audio-video synchronization in my
> application. I say crude because it doesn't need to be microsecond-accurate
> -- the picture won't be married to the audio, it'll just play alongside a
> software synth so that the user can perform in sync to the picture.
> Unfortunately, I have little to no experience in AV synching, so I'm not
> sure my approach would work. Basically, I'm thinking I could just group
> samples together into video frames (e.g. at 48kHz and 25fps, there would be
> 1920 samples of audio in a frame of video) and count them during playback;
> whenever the next block starts, I'd simply display the next frame.
> However, something tells me this might be __too__ crude an implementation.
> Additionally, things might get hairy with fractional framerates, such as
> 29.97 and 23.976 -- I'm worried that rounding errors might accumulate.
> Any ideas?
> Thanks,
> Alex
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden