Crude video synchronization
- Subject: Crude video synchronization
- From: Alex Weiss <email@hidden>
- Date: Sat, 21 May 2011 12:03:53 +0200
Hello,
I'm trying to implement some crude audio-video synchronization in my application. I say crude because it doesn't need to be microsecond-accurate -- the picture won't be married to the audio, it'll just play alongside a software synth so that the user can perform in sync with the picture. Unfortunately, I have little to no experience in A/V syncing, so I'm not sure my approach would work. Basically, I'm thinking I could just group samples together into video frames (e.g. at 48kHz and 25fps, there would be 1920 samples of audio per frame of video) and count them during playback; whenever the next block starts, I'd simply display the next frame.
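For what it's worth, a minimal sketch of that idea (the names kSampleRate, gSamplesPlayed, etc. are placeholders I've made up, not anything from CoreAudio): keep a running count of samples rendered in the audio callback, and let the video side derive the current frame index from it by integer division.

#include <stdint.h>
#include <stdatomic.h>

/* Illustrative constants for the 48 kHz / 25 fps example. */
#define kSampleRate      48000
#define kSamplesPerFrame (kSampleRate / 25)   /* 1920 audio samples per video frame */

/* Running count of samples rendered so far; written from the audio
   (render callback) thread, read from the video/UI thread. */
static _Atomic uint64_t gSamplesPlayed = 0;

/* Call from the render callback after producing inNumberFrames samples. */
static void NoteSamplesRendered(uint32_t inNumberFrames)
{
    atomic_fetch_add(&gSamplesPlayed, inNumberFrames);
}

/* Call from the video thread (e.g. on a display timer) to find out
   which frame should currently be on screen. */
static uint64_t CurrentVideoFrame(void)
{
    return atomic_load(&gSamplesPlayed) / kSamplesPerFrame;
}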
However, something tells me this might be __too__ crude an implementation. Additionally, things might get hairy with fractional framerates, such as 29.97 and 23.976 -- I'm worried that rounding errors might accumulate.
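On the rounding worry: errors only accumulate if you add a rounded samples-per-frame value once per frame. If instead the frame index is always recomputed from the total sample count using the exact rational frame rate (29.97 is really 30000/1001, 23.976 is 24000/1001), there is nothing to accumulate. A sketch, again with made-up names:

#include <stdint.h>

/* Exact rational frame rates. */
typedef struct { uint32_t num; uint32_t den; } FrameRate;

static const FrameRate kNTSC2997  = { 30000, 1001 };  /* "29.97" fps */
static const FrameRate kFilm23976 = { 24000, 1001 };  /* "23.976" fps */

/* Frame index = floor(samplesPlayed * fps / sampleRate), done entirely in
   64-bit integer arithmetic so no rounding error builds up over time. */
static uint64_t FrameForSample(uint64_t samplesPlayed,
                               FrameRate fps,
                               uint32_t sampleRate)
{
    return (samplesPlayed * fps.num) / ((uint64_t)sampleRate * fps.den);
}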
Any ideas?
Thanks,
Alex