Re: Getting exact playout time of audio buffers in GarageBand
- Subject: Re: Getting exact playout time of audio buffers in GarageBand
- From: William Stewart <email@hidden>
- Date: Thu, 7 Jul 2005 15:03:47 -0700
To add to what both Jeff and Chris have described:
In Tiger we added a new property:
kAudioUnitProperty_PresentationLatency
As this is a new property, I doubt that any host supports it yet, but it is designed to deal with exactly the problem you are having.
The following are the comments as they appear in <AudioUnit/AudioUnitProperties.h>:

kAudioUnitProperty_PresentationLatency (Input/Output Scope) Float64 (write only)
This property is set by a host to describe to the AU the presentation latency of its input and/or output audio data. It describes this latency in seconds. A value of zero means either no latency or an unknown latency.

This is a write-only property because the host is telling the AU both the latency of the data it provides for input and the latency from getting the data from the AU until it is presented.

The property should be set on each active input and output bus (Scope/Element pair). For example, an AU with multiple outputs will have the output data it produces processed by different AUs, etc., before it is mixed and presented. Thus, in this case, each output element could have a different presentation latency.

This should not be confused with the Latency property, where the AU describes to the host any processing latency it introduces between its input and its output.
For input:
Describes how long ago the audio given to an AU was acquired. For instance, when reading from a file to the first AU, its input presentation latency will be zero. When processing audio input from a device, this initial input latency will be the presentation latency of the device itself, that is, the device's safety offset and latency.
The next AU's (connected to that first AU) input presentation latency will be the input presentation latency of the first AU, plus the processing latency (as expressed by kAudioUnitProperty_Latency) of the first AU.
For output:
Describes how long before the output audio of an AU is to be presented. For instance, when writing to a file, the last AU's output presentation latency will be zero. When the audio from that AU is to be played to an AudioDevice, that initial presentation latency will be the latency of the device itself, which is the I/O buffer size, plus the device's safety offset and latency.
The previous AU's (connected to this last AU) output presentation latency will be that initial presentation latency plus the processing latency (as expressed by kAudioUnitProperty_Latency) of the last AU.
Thus, for a given AU anywhere within a mixing graph, the input and output presentation latencies describe to that AU how long from the moment of generation it will take for its input to arrive, and how long it will take for its output to be presented.

An example usage for this property is for an AU to provide accurate metering of its output, where it is generating output that will be presented at some future time (as described by the output presentation latency) to the user.
Bill
<snipped previous posts>
--
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________
Coreaudio-api mailing list (email@hidden)