Re: Audio Unit/AudioConverter and getting what's playing *now*
- Subject: Re: Audio Unit/AudioConverter and getting what's playing *now*
- From: Doug Wyatt <email@hidden>
- Date: Fri, 22 Oct 2004 09:41:24 -0700
On Oct 21, 2004, at 23:23, Simon Fraser wrote:
My problem is that this sound has to sync to video. There's a delay between calling AudioOutputUnitStart() and the sound coming out, I think because of some thread-starvation issues that mean my ACComplexInputProc can't always supply as many packets as the converter wants (which may also happen because of network bandwidth issues in the normal course of events). So I need to be able to ask the audio device (or some other component) for the timestamp of the samples that are currently being played.
I've tried getting the audio device for the AU with AudioUnitGetProperty(myUnit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Output...), and then calling AudioDeviceGetCurrentTime(), but this just returns the number of samples processed since I called AudioOutputUnitStart() (i.e. it continues to increase even during the initial period of silence). AudioDeviceTranslateTime() didn't seem to help me either.
AudioDeviceGetCurrentTime() does return the current hardware sample
number. This is actually an arbitrary number; it will start at
something other than 0 in many situations (e.g. if another process was
already playing sound when yours started). This time also has nothing
to do with the current buffer you're playing; it is the HAL's
computation of what sample number is currently hitting the hardware.
AudioDeviceTranslateTime() just converts between those driver absolute
sample numbers and host time (and other units if the driver supports
them, though I don't know of any drivers that do, yet).
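Since the HAL's sample time has an arbitrary origin, only the *difference* between two readings is meaningful. A minimal sketch of that idea in portable C (the function name is illustrative, not a CoreAudio API; on a real system the two readings would come from successive AudioDeviceGetCurrentTime() calls):

```c
/* The HAL's current sample time starts at an arbitrary value, so
 * subtracting two readings cancels the origin and leaves the real
 * elapsed time.  hal_elapsed_seconds() is a hypothetical helper. */
static double hal_elapsed_seconds(double earlierSampleTime,
                                  double laterSampleTime,
                                  double deviceSampleRate)
{
    /* Subtraction cancels the arbitrary starting sample number. */
    return (laterSampleTime - earlierSampleTime) / deviceSampleRate;
}
```

For example, at a 44100 Hz device, readings of 130500.0 and then 174600.0 are exactly one second apart, regardless of where the counter started.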
You can also look at the stream of timestamps coming from the output
unit in its calls to get input from another AU or an input callback.
These timestamps do start at 0 when you start the output unit. If there
is a sample rate conversion in the output unit (e.g. hardware is at
44100 and client is at 22050), then the timestamps are transformed to
run at the client sample rate.
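The rate-scaling Doug describes can be sketched as simple arithmetic (again a hypothetical helper, not a CoreAudio call): when the output unit resamples, the timestamps your input callback sees advance at the client rate, so a count of hardware frames since AudioOutputUnitStart() maps to client sample time by the ratio of the rates.

```c
/* Sketch: convert hardware frames elapsed since the output unit
 * started into the client-rate sample time the input callback's
 * timestamps would show.  Assumes the timestamps started at 0 when
 * the output unit was started, as described above. */
static double client_sample_time(double hardwareFramesSinceStart,
                                 double hardwareRate,
                                 double clientRate)
{
    return hardwareFramesSinceStart * (clientRate / hardwareRate);
}
```

So with hardware at 44100 and the client at 22050, one hardware second (44100 frames) corresponds to a client sample time of 22050.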
To do anything more with these timestamps requires some notion of where you are in your timeline -- you'll need to know which hardware reference time corresponds to the beginning of the timeline, and make your calculations relative to that start time.
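Anchoring the timeline amounts to recording the sample time at which your timeline's t = 0 begins and computing every position relative to it. A hedged sketch (names are made up for illustration):

```c
/* Hypothetical helper: given the sample time recorded when your
 * timeline began (the anchor) and the sample time of the buffer now
 * playing, compute the playback position in seconds on your timeline.
 * Both sample times must be in the same units (e.g. both at the
 * client sample rate). */
static double timeline_seconds(double anchorSampleTime,
                               double currentSampleTime,
                               double sampleRate)
{
    return (currentSampleTime - anchorSampleTime) / sampleRate;
}
```

For instance, if the timeline's anchor was recorded at sample time 22050.0 at a 22050 Hz client rate, a current sample time of 44100.0 means you are one second into the timeline.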
You should start the audio hardware running before you do anything
related to sync; in some situations it can take a noticeable amount of
time for the hardware to start. One way to do this is simply to start
the output unit and feed it silence before your timeline begins.
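The "feed it silence" step can be sketched as zero-filling any frames that fall before the timeline's anchor. This is a portable illustration of the logic only (the function and parameter names are invented; in practice this would live in your render/input callback and the sample times would come from the callback's AudioTimeStamp):

```c
#include <string.h>

/* Sketch: zero-fill the leading portion of a mono float buffer whose
 * frames precede the timeline's start sample time.  Returns how many
 * frames were silenced; the caller would supply real audio for the
 * rest.  Not a CoreAudio API. */
static unsigned fill_leading_silence(float *buffer,
                                     unsigned frameCount,
                                     double bufferStartSampleTime,
                                     double timelineStartSampleTime)
{
    double silentFrames = timelineStartSampleTime - bufferStartSampleTime;
    unsigned n;
    if (silentFrames <= 0.0)
        n = 0;                        /* timeline already running */
    else if (silentFrames >= (double)frameCount)
        n = frameCount;               /* whole buffer is pre-roll */
    else
        n = (unsigned)silentFrames;   /* partial buffer of silence */
    memset(buffer, 0, n * sizeof(float));
    return n;
}
```

Once the hardware has been running through this silent pre-roll, the start latency Doug mentions has already been paid before the timeline begins.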
Doug
--
Doug Wyatt
Core Audio, Apple
_______________________________________________
Coreaudio-api mailing list (email@hidden)