Re: Audiounit I/O Buffer Size
- Subject: Re: Audiounit I/O Buffer Size
- From: Brian Willoughby <email@hidden>
- Date: Thu, 31 Jan 2008 08:19:52 -0800
Does Logic ask the driver to change the hardware I/O buffer size?
What if another audio program running at the same time requests a
larger hardware buffer size while Logic is in record mode? I don't
think Logic sets hog mode (I could be wrong), so there is no
guarantee that the buffer size Logic uses internally matches the one
set in the driver and actual hardware.
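
For reference, here is a minimal sketch (plain CoreAudio C API, which
compiles as C++ too) of how any process can read the shared device's
current buffer size and hog-mode owner. Device selection is
simplified to the default output device, which is an assumption; the
DAW may be addressing a different device entirely.

    #include <CoreAudio/CoreAudio.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(void)
    {
        // Find the default output device (assumption: the device of
        // interest; Logic may be using another one).
        AudioObjectPropertyAddress addr = {
            kAudioHardwarePropertyDefaultOutputDevice,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        AudioDeviceID device = kAudioObjectUnknown;
        UInt32 size = sizeof(device);
        AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                                   0, NULL, &size, &device);

        // The hardware I/O buffer size is a property of the shared
        // device, not of any one application.
        addr.mSelector = kAudioDevicePropertyBufferFrameSize;
        UInt32 frames = 0;
        size = sizeof(frames);
        AudioObjectGetPropertyData(device, &addr, 0, NULL,
                                   &size, &frames);

        // Hog mode: a pid of -1 means no process owns the device
        // exclusively, i.e. everyone is sharing it.
        addr.mSelector = kAudioDevicePropertyHogMode;
        pid_t hogger = -1;
        size = sizeof(hogger);
        AudioObjectGetPropertyData(device, &addr, 0, NULL,
                                   &size, &hogger);

        printf("device buffer: %u frames, hog pid: %d\n",
               (unsigned)frames, (int)hogger);
        return 0;
    }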
In other words, I have been trying all along to establish that there
is a difference between the buffer size used within an application
(e.g. an AUGraph) for talking between AudioUnits and their host, and
the actual hardware I/O buffer size set by the driver. Audio devices
are shared, and it is very easy for more than one application to have
different requirements. You might get speedy turnaround (low
latency) within Logic, but still not hear the results on the outputs
without additional latency. i.e., your internal processing buffer is
not the one used by the driver.
Sorry if I am sounding like a broken record. I don't seem to be
making myself clear.
Brian Willoughby
Sound Consulting
On Jan 31, 2008, at 03:19, Jankoen de Haan wrote:
Hi Brian
I understand what you are saying, but you are overlooking one thing.
Logic changes its internal processing buffer size to an appropriate
size according to the function of a track: disk track or "live".
When a track becomes "live" in Logic, the latency should be the
smallest possible (the I/O buffer size is the smallest buffer size
that makes sense). Logic changes this in real time. So when you press
record on a track, it becomes live, gets the lowest latency possible,
and the internal processing buffer sizes are adapted accordingly.
So when running as an AU in Logic, you have to deal with several
different signal flows, each with its own latency and buffer sizes.
JK
I fully understand the importance of buffer size, because I have
written convolution plugins myself.
However, the point I am trying to make is that there seems to be no
guaranteed correlation between the hardware I/O buffer size set by
the driver and the buffer size in the AUGraph used by the DAW.
Since many programs can run at the same time sharing the hardware
through the driver, not every program can have full control over
the hardware I/O buffer size. In my estimation, if the DAW uses a
large buffer when communicating with its AudioUnits, then even a
small hardware buffer would offer no improvement for this
application. Likewise, if the application is using a very small
buffer for AudioUnits, but the hardware is set to a large I/O
buffer by some other application, then you will again see no benefit.
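
To put rough numbers on that (hypothetical figures, just for
illustration): the latency contributed by a buffer of N frames at
sample rate fs is N / fs.

    512 frames / 44100 Hz  ~= 11.6 ms   (large DAW/AU buffer)
     32 frames / 44100 Hz  ~=  0.73 ms  (small hardware buffer)

Each stage contributes its own buffer's worth of delay, so the path
is only as responsive as its largest buffer; shrinking one buffer
while the other stays large buys you almost nothing.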
That is, it only seems useful for the AudioUnit to be able to request
the buffer size within the DAW. It is up to the user and the system
to set the hardware I/O buffer size appropriately, balancing safety
and reliability against low latency. About the only reason I could
see for an AudioUnit to
reach all the way to the hardware to query the buffer size is if
you were planning on presenting a message to the user to alert them
that their system is set up wrong because there is a mismatch
between the DAW and the hardware buffer sizes.
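
If one did want to implement that warning, a minimal sketch might
look like the following. This is a host-side view for simplicity
(inside the plugin you would read your own max-frames value instead),
and it assumes the default output device is the one the DAW is
actually using, which is not guaranteed; the helper name is made up.

    #include <CoreAudio/CoreAudio.h>
    #include <AudioUnit/AudioUnit.h>

    // Hypothetical helper: true when the host's slice size and the
    // default device's hardware buffer size disagree.
    bool BufferSizesMismatch(AudioUnit unit)
    {
        // Slice size the host promised this AU (what the DAW renders
        // with; a ceiling, not necessarily every render's size).
        UInt32 maxFrames = 0;
        UInt32 size = sizeof(maxFrames);
        AudioUnitGetProperty(unit,
                             kAudioUnitProperty_MaximumFramesPerSlice,
                             kAudioUnitScope_Global, 0,
                             &maxFrames, &size);

        // Hardware buffer size of the default output device
        // (assumption: this is the device the DAW talks to).
        AudioObjectPropertyAddress addr = {
            kAudioHardwarePropertyDefaultOutputDevice,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        AudioDeviceID device = kAudioObjectUnknown;
        size = sizeof(device);
        AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                                   0, NULL, &size, &device);

        addr.mSelector = kAudioDevicePropertyBufferFrameSize;
        UInt32 hwFrames = 0;
        size = sizeof(hwFrames);
        AudioObjectGetPropertyData(device, &addr, 0, NULL,
                                   &size, &hwFrames);

        return maxFrames != hwFrames;
    }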
If I am missing something in the way I am viewing things, please
explain. I could certainly be overlooking something.
Brian Willoughby
On Jan 30, 2008, at 00:13, Jankoen de Haan wrote:
The I/O buffer size is important, since it determines the smallest
possible latency when going in -> out of your DAW. Since musicians
nowadays use their DAW to process their sounds with FX, it is really
important to have as little latency as possible. Via a special
technique it is possible to do zero-latency convolution, but then you
need to know the smallest possible latency in the whole system to do
the work in a highly optimized way.
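
The core of that trick, in a minimal sketch (my reading of the
Gardner-style non-uniform partitioned convolution; the names here are
made up for illustration): convolve the first taps of the impulse
response directly in the time domain, so that stage produces output
immediately, and let FFT partitions (omitted here) handle the rest of
the tail.

    #include <string.h>

    // Hypothetical state for the time-domain "head" of the IR.
    struct DirectHead {
        const float *ir;  // first irLen taps of the impulse response
        int irLen;        // head length, typically one I/O buffer
        float *history;   // previous irLen - 1 inputs, oldest first
    };

    // Convolve one block of n input samples with the IR head. No FFT
    // block has to fill before output appears, so this stage adds
    // zero latency; its CPU cost is why the real I/O buffer size
    // matters so much.
    void ProcessDirectHead(DirectHead *h, const float *in,
                           float *out, int n)
    {
        for (int i = 0; i < n; i++) {
            float acc = 0.0f;
            for (int k = 0; k < h->irLen; k++) {
                int idx = i - k;
                acc += h->ir[k] * (idx >= 0
                                   ? in[idx]
                                   : h->history[h->irLen - 1 + idx]);
            }
            out[i] = acc;
        }
        // Keep the most recent irLen - 1 inputs for the next block.
        int hist = h->irLen - 1;
        if (n >= hist) {
            memcpy(h->history, in + n - hist, hist * sizeof(float));
        } else {
            memmove(h->history, h->history + n,
                    (hist - n) * sizeof(float));
            memcpy(h->history + hist - n, in, n * sizeof(float));
        }
    }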
This raises an important question (in my mind, at least): If the
user sets a particular buffer size in a DAW such as Logic, does it
really matter what the hardware buffer size is? Wouldn't having
too large of a buffer in Logic defeat any advantage you might have
in your AU by using a smaller buffer based on the hardware? In
other words, I am thinking that all you need to know is the buffer
size that the DAW will use for your AU. Anything beyond that
cannot be taken advantage of, unless I am missing some detail...
Brian Willoughby
Sound Consulting
On Jan 29, 2008, at 05:39, Jankoen de Haan wrote:
I wish I could get hold of the hardware buffer size from within an
audio unit. I know I should not depend on this, and that I should use
GetMaxFramesPerSlice to set up my buffers.

But.... I would really like this value so I can minimize the latency
of my plugin. Since I use convolution in my audio unit, I always have
to trade off latency against CPU load.

I would like to implement some buffer size adaptation internal to the
plugin (like I did for RTAS) to make sure that the plugin performs in
the best way I can imagine. So when the track hosting the audio unit
switches to "live", it performs with the lowest latency that makes
sense. The only value I would really like to get hold of is the I/O
buffer size as set in the preferences panel of Logic.
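
For what it is worth, the value the host does hand the plugin is the
maximum slice size; inside a C++ AU built on the AU SDK's AUBase that
is just GetMaxFramesPerSlice(). A minimal sketch (the class and the
AllocatePartitions helper are hypothetical; the hardware figure from
Logic's preferences is not exposed through any AudioUnit property I
know of):

    // Inside an AUEffectBase subclass (AU SDK).
    // GetMaxFramesPerSlice() returns the ceiling the host set via
    // kAudioUnitProperty_MaximumFramesPerSlice; actual render calls
    // may deliver fewer frames than this.
    OSStatus MyConvolutionAU::Initialize()
    {
        OSStatus err = AUEffectBase::Initialize();
        if (err != noErr) return err;

        UInt32 maxFrames = GetMaxFramesPerSlice();
        // Hypothetical helper: size the convolution partitions from
        // this ceiling, since it is the only buffer bound we get.
        AllocatePartitions(maxFrames);
        return noErr;
    }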