Re: conversion needed for syncing audio and video
- Subject: Re: conversion needed for syncing audio and video
- From: "Chase" <email@hidden>
- Date: Thu, 10 May 2007 21:29:49 +0000
On May 10, 2007, Jeff Moore wrote:
> If your time stamp doesn't already have the host time field populated,
> you need to convert the sample time to host time using
> AudioDeviceTranslateTime(). From there, you can put the host time in a
> CVTimeStamp and use CVDisplayLinkTranslateTime to get to whichever
> video time base you want that the display link supports.
>
Thank you. That worked. The conversion looks correct and I can successfully pass the converted CVTimeStamp on to the Core Video renderer.
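For reference, the chain Jeff describes boils down to roughly this (a trimmed sketch of what I'm doing, assuming 'device' is the output device the audio is playing on, 'sampleTS' is an AudioTimeStamp with a valid mSampleTime, and 'displayLink' is my CVDisplayLinkRef; the full code is below):

    // sample time -> host time on the audio device
    AudioTimeStamp hostTS;
    bzero(&hostTS, sizeof(hostTS));
    hostTS.mFlags = kAudioTimeStampHostTimeValid;
    AudioDeviceTranslateTime(device, &sampleTS, &hostTS);

    // host time -> the display link's video time base
    CVTimeStamp cvIn, cvOut;
    bzero(&cvIn, sizeof(cvIn));
    bzero(&cvOut, sizeof(cvOut));
    cvIn.hostTime = hostTS.mHostTime;
    cvIn.flags = kCVTimeStampHostTimeValid;
    cvOut.flags = kCVTimeStampVideoTimeValid | kCVTimeStampHostTimeValid;
    CVDisplayLinkTranslateTime(displayLink, &cvIn, &cvOut);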
But for some reason my video and audio still drift apart after a minute or so.
I can't figure out how this could be happening. The time stamp is generated from the running audio, so the two should naturally stay in sync. I could see them being offset by a couple of frames, but not drifting further and further out of sync as the movie progresses. After 3 or 4 minutes they are WAY out of sync (a couple of seconds apart).
Also, if I pause the video and let the audio advance for a while and then unpause the video, it starts right where I left off (when I originally hit pause) instead of jumping to where the audio currently is.
Here's the code:
static CVReturn renderCallback( CVDisplayLinkRef displayLink,
                                const CVTimeStamp * inNow,
                                const CVTimeStamp * inOutputTime,
                                CVOptionFlags flagsIn,
                                CVOptionFlags * flagsOut,
                                void * displayLinkContext )
{
    // Ask the sound object for the current audio host time and wrap it in a CVTimeStamp.
    static CVTimeStamp ts_audio;
    bzero(&ts_audio, sizeof(ts_audio));
    ts_audio.hostTime = [[(VideoView*)displayLinkContext sound] currentHostTime];
    ts_audio.flags = kCVTimeStampHostTimeValid;

    // Translate the audio host time into the display link's video time base.
    static CVTimeStamp ts_video;
    bzero(&ts_video, sizeof(ts_video));
    ts_video.flags = kCVTimeStampVideoTimeValid | kCVTimeStampHostTimeValid;
    CVDisplayLinkTranslateTime(displayLink, &ts_audio, &ts_video);

    // Render the frame corresponding to the translated time stamp.
    return [(VideoView*)displayLinkContext renderTime:&ts_video];
}
- (UInt64)currentHostTime
{
    // Get the audio unit's current play time as a sample time.
    static AudioTimeStamp ts;
    bzero(&ts, sizeof(ts));
    ts.mFlags = kAudioTimeStampSampleTimeValid;
    _videoSyncDataSize = sizeof(ts);
    AudioUnitGetProperty(_audioUnit, kAudioUnitProperty_CurrentPlayTime,
                         kAudioUnitScope_Global, 0, &ts, &_videoSyncDataSize);

    // Find the default output device the audio is playing on.
    AudioDeviceID theoutdevice;
    UInt32 theSize = sizeof(AudioDeviceID);
    AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice,
                             &theSize, &theoutdevice);

    // Translate the sample time into a host time on that device.
    bzero(&_videoSyncTimeStamp, sizeof(_videoSyncTimeStamp));
    _videoSyncTimeStamp.mFlags = kAudioTimeStampHostTimeValid;
    AudioDeviceTranslateTime(theoutdevice, &ts, &_videoSyncTimeStamp);

    return _videoSyncTimeStamp.mHostTime;
}
Thanks.
- Chase