Re: Audiounit I/O Buffer Size
- Subject: Re: Audiounit I/O Buffer Size
- From: Brian Willoughby <email@hidden>
- Date: Thu, 31 Jan 2008 01:38:46 -0800
I fully understand the importance of buffer size, because I have
written convolution plugins myself.
However, the point I am trying to make is that there seems to be no
guaranteed correlation between the hardware I/O buffer size set by
the driver and the buffer size in the AUGraph used by the DAW. Since
many programs can run at the same time sharing the hardware through
the driver, not every program can have full control over the hardware
I/O buffer size. In my estimation, if the DAW uses a large buffer
when communicating with its AudioUnits, then even a small hardware
buffer would offer no improvement for this application. Likewise, if
the application is using a very small buffer for AudioUnits, but the
hardware is set to a large I/O buffer by some other application, then
you will again see no benefit.
In other words, it only seems useful for the AudioUnit to be able to
request the buffer size within the DAW. It is up to the user and the
system to
set the hardware I/O buffer size appropriately, to balance the
tradeoff between safety and reliability versus low latency. About
the only reason I could see for an AudioUnit to reach all the way to
the hardware to query the buffer size is if you were planning on
presenting a message to the user to alert them that their system is
set up wrong because there is a mismatch between the DAW and the
hardware buffer sizes.
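The reasoning above can be put in numbers. The sketch below (illustrative
only; it models just the two buffers under discussion, not every source of
latency in a real signal chain) shows why shrinking only one of the two
buffers buys nothing: the larger buffer dominates.

```python
# Sketch of the argument: the round-trip buffering latency is governed
# by the larger of the DAW buffer and the hardware I/O buffer, so a
# small value on one side is wasted if the other side is large.
# This is a simplified model, not a measurement of any real system.

def buffer_latency_ms(frames: int, sample_rate: float = 44100.0) -> float:
    """Latency contributed by one buffer of `frames` samples, in ms."""
    return 1000.0 * frames / sample_rate

def effective_latency_ms(daw_frames: int, hw_frames: int,
                         sample_rate: float = 44100.0) -> float:
    """The larger (slower) of the two buffers dominates."""
    return buffer_latency_ms(max(daw_frames, hw_frames), sample_rate)

# Small hardware buffer, large DAW buffer: no benefit from the hardware.
print(effective_latency_ms(daw_frames=1024, hw_frames=64))
# Small DAW buffer, large hardware buffer: again no benefit.
print(effective_latency_ms(daw_frames=64, hw_frames=1024))
```

Both calls print the same value, which is the point of the argument: only
lowering both buffers together reduces the achievable latency.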
If I am missing something in the way I am viewing things, please
explain. I could certainly be overlooking something.
Brian Willoughby
On Jan 30, 2008, at 00:13, Jankoen de Haan wrote:
Hi Brian
The IO buffer size is important since that is the smallest possible
latency when going in -> out of your DAW. Since musicians nowadays
use their DAW to process their sounds with FX, it is really important
to have as little latency as possible. Via a special technique it is
possible to do zero-latency convolution, but then you need to know
the smallest possible latency in the whole system to do the work in a
highly optimized way.
Jankoen de Haan
Audio Ease BV
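The "special technique" mentioned above is not named in the thread; one
well-known scheme that fits the description is non-uniform partitioned
convolution, where the impulse response is split into blocks that start at
the I/O buffer size and double thereafter, so the first block adds no extra
latency while the larger blocks keep the CPU cost down. The sketch below is
a hypothetical illustration of that partition layout only (it does not
perform the convolution), and it shows why such a plugin wants to know the
host's I/O buffer size:

```python
# Hypothetical sketch: compute the block sizes of a non-uniform
# partitioning of an impulse response, starting at the I/O buffer size
# and doubling. The first block can be processed with no added latency;
# later, larger blocks amortize the FFT cost. Layout only, no DSP.

def partition_sizes(ir_length: int, io_buffer: int) -> list:
    """Block sizes covering an IR of `ir_length` samples, starting at
    `io_buffer` frames and doubling until the IR is covered."""
    sizes = []
    offset, block = 0, io_buffer
    while offset < ir_length:
        sizes.append(block)
        offset += block
        block *= 2
    return sizes

# A ~2.3 s impulse response at 44.1 kHz, host I/O buffer of 64 frames:
print(partition_sizes(ir_length=100_000, io_buffer=64))
```

The first partition equals the I/O buffer size, which is exactly why the
plugin would like to query that value from the host.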
This raises an important question (in my mind, at least): If the
user sets a particular buffer size in a DAW such as Logic, does it
really matter what the hardware buffer size is? Wouldn't having
too large of a buffer in Logic defeat any advantage you might have
in your AU by using a smaller buffer based on the hardware? In
other words, I am thinking that all you need to know is the buffer
size that the DAW will use for your AU. Anything beyond that
cannot be taken advantage of, unless I am missing some detail...
Brian Willoughby
Sound Consulting
On Jan 29, 2008, at 05:39, Jankoen de Haan wrote:
I wish I could get hold of the hardware buffer size from within an
audio unit. I know I should not be dependent on this and that I
should use GetMaxFramesPerSlice to set up my buffers.
But.... I would really like this value so I can minimize the
latency of my plugin. Since I use convolution in my audiounit, I
always have to optimize between latency and CPU heaviness.
I would like to implement some buffer size adaptation internally in
the plugin (like I did for RTAS) to make sure that the plugin
performs as well as possible. So when the track of the audiounit
switches to be "live", it performs with the lowest latency that
makes sense. The only value I would really like to get hold of is
the I/O buffer size as set in the preferences panel of Logic.
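The latency-versus-CPU tradeoff described above can be sketched for a
uniformly partitioned FFT convolver. The cost model below is a rough
textbook approximation (one FFT/IFFT pair of size 2×block plus one
complex multiply-accumulate per partition per output sample), not a
measurement of any real plugin; the numbers are illustrative only.

```python
# Illustrative sketch: for a uniformly partitioned convolver, the
# processing latency equals the block size, while the per-sample work
# grows with the number of partitions. Smaller blocks -> lower latency
# but more partitions, hence more CPU. Rough model, not a benchmark.
import math

def tradeoff(ir_length: int, block: int, sample_rate: float = 44100.0):
    """Return (latency_ms, relative_work) for a given block size."""
    partitions = math.ceil(ir_length / block)
    latency_ms = 1000.0 * block / sample_rate
    # log2(2*block) ~ FFT butterflies per sample; one MAC per partition.
    work = math.log2(2 * block) + partitions
    return latency_ms, work

# A 2-second impulse response at 44.1 kHz, for several block sizes:
for block in (64, 256, 1024, 4096):
    latency, work = tradeoff(ir_length=88_200, block=block)
    print(f"block={block:5d}  latency={latency:6.2f} ms  work={work:8.1f}")
```

This is why knowing the host's actual I/O buffer size matters: it tells
the plugin the smallest block size that still makes sense, so it can pick
the cheapest configuration that does not add latency beyond the host's own.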
Coreaudio-api mailing list (email@hidden)