Deriving timing information for scheduled playback on an AUGraph
- Subject: Deriving timing information for scheduled playback on an AUGraph
- From: Kyle Sluder <email@hidden>
- Date: Wed, 16 Jun 2010 23:28:44 -0700
Hello,
First I'd like to thank the engineer who helped me out at the audio
lab at WWDC last week, on top of all those who gave the really
informative sessions. I cannot recall to whom I spoke at the lab, but
you put me on the path of using Audio Units rather than Audio Queues
to get the results I need. But I still have some lingering questions.
I am building a radio automation application; it plays back audio
tracks according to a schedule. It's slightly more complicated than
your typical iTunes playlist in that the schedule can be flexible at
times; some of the actual audio tracks chosen for playback might not
be known until runtime, there might be gaps in the schedule whose
duration is controlled by the operator, and most, but not all, audio
events are scheduled relative to each other, not to the wall clock.
My first question is, what timing source should I use to determine the
wall-clock time? NSTimer would be insufficient for this task. Will
adding a render notification callback to my output unit be sufficient
to derive the canonical time for my entire application, or should I
use an independent timer mechanism to determine the current wall
clock and use the sample timestamps only for filling buffers correctly?
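To make the question concrete, here is roughly what I picture for the
render-notify approach (just a sketch; the comment marks where I'd stash
the timestamp for the rest of the app to read):

#include <AudioUnit/AudioUnit.h>

// Render-notify callback attached to the output unit. The pre-render
// pass carries the AudioTimeStamp for the buffer about to be rendered;
// when mHostTime is valid it ties the device's sample clock to host time.
static OSStatus RenderNotify(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData)
{
    if ((*ioActionFlags & kAudioUnitRenderAction_PreRender) &&
        (inTimeStamp->mFlags & kAudioTimeStampHostTimeValid)) {
        // e.g. stash inTimeStamp->mSampleTime and ->mHostTime somewhere
        // lock-free so the scheduler on the main thread can read them.
    }
    return noErr;
}

// Attached once the graph is built, with outputUnit taken from the
// graph's output node:
//     AudioUnitAddRenderNotify(outputUnit, RenderNotify, myContext);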
My second question is, does Core Audio perform any clock drift
compensation when multiple output units are used in a graph? The user
might schedule some tracks to play back on a different hardware
device; even if both devices are nominally running at the same clock
speed, I'm worried that Core Audio can only say "I've fed 10,000
samples to the hardware so far, and it claims to be running at
44.1 kHz, so the wall-clock time for the next samples I ask for must be
10,000/44,100 seconds since I started running." What if the same
generator unit is being used to provide audio for two out-of-sync
devices?
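To observe the drift myself, I imagine I could ask the HAL for each
device's measured sample rate and compare, along these lines (sketch
only; error handling mostly omitted):

#include <CoreAudio/CoreAudio.h>

// Ask the HAL for the measured ("actual") sample rate of a device,
// which wanders around the nominal rate. Comparing the values for two
// devices gives a rough picture of their relative drift.
static Float64 ActualSampleRate(AudioObjectID deviceID)
{
    Float64 rate = 0.0;
    UInt32 size = sizeof(rate);
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyActualSampleRate,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    if (AudioObjectGetPropertyData(deviceID, &addr, 0, NULL,
                                   &size, &rate) != noErr) {
        return 0.0;
    }
    return rate;
}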
If Core Audio can't do any compensation, what does this mean for
determining the current wall-clock time for my application? Could a
slow- or fast-running hardware device, over a long period of time, cause
its timestamps to drift out of sync with reality? I also need
to look into running different output units at different clock speeds,
but I suppose I'm expected to insert a varispeed unit into the graph
to compensate.
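In case it clarifies what I mean, here is the kind of graph surgery I
picture, assuming the graph is already open and running (sketch only;
error handling omitted, and the graph/node names are just placeholders):

#include <AudioUnit/AudioUnit.h>
#include <AudioToolbox/AudioToolbox.h>

// Insert a varispeed unit between the generator node and the second
// output node so its playback rate can be trimmed to follow that
// device's clock.
static void InsertVarispeed(AUGraph graph, AUNode generatorNode,
                            AUNode secondOutputNode)
{
    AudioComponentDescription vsDesc = {
        .componentType         = kAudioUnitType_FormatConverter,
        .componentSubType      = kAudioUnitSubType_Varispeed,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AUNode varispeedNode;
    AUGraphAddNode(graph, &vsDesc, &varispeedNode);

    // generator -> varispeed -> second output
    AUGraphConnectNodeInput(graph, generatorNode, 0, varispeedNode, 0);
    AUGraphConnectNodeInput(graph, varispeedNode, 0, secondOutputNode, 0);
    AUGraphUpdate(graph, NULL);

    // Later, the rate would be nudged slightly to track measured drift:
    AudioUnit varispeedUnit;
    AUGraphNodeInfo(graph, varispeedNode, NULL, &varispeedUnit);
    AudioUnitSetParameter(varispeedUnit, kVarispeedParam_PlaybackRate,
                          kAudioUnitScope_Global, 0, 1.0001, 0);
}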
Thanks for the help, all,
--Kyle Sluder