RE: AudioUnitV3 effect and "maximumFramesToRender"
- Subject: RE: AudioUnitV3 effect and "maximumFramesToRender"
- From: Jonatan Liljedahl <email@hidden>
- Date: Mon, 17 Dec 2018 13:14:03 +0100
Hi,
If you read the documentation for maximumFramesToRender you'll see:
"This must be set by the *host* before render resources are
allocated." (my emphasis).
This property is set by the host, to let the plugin know how many
frames *maximum* it will ask it to render, nothing else. This is to
allow the plugin to allocate internal buffers etc at suitable sizes in
its allocateRenderResources method.
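In Swift, that could look something like this. A sketch only: MyEffectAudioUnit and scratchBuffer are illustrative names, and a real plugin would use its actual output bus format instead of the hardcoded one here:

```swift
import AudioToolbox
import AVFoundation

class MyEffectAudioUnit: AUAudioUnit {
    private var scratchBuffer: AVAudioPCMBuffer?

    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        // By this point the host has (or should have) set maximumFramesToRender,
        // so size internal buffers for the worst case it promised.
        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                                   channels: 2)!
        scratchBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                         frameCapacity: maximumFramesToRender)
    }

    override func deallocateRenderResources() {
        scratchBuffer = nil
        super.deallocateRenderResources()
    }
}
```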
The reason it's also settable in the plugin's own init method is
probably simply to provide a default value in case the host does not
set it.
Then, the actual number of frames the host renders from the plugin can
differ from call to call, but should be less than or equal to the
value maximumFramesToRender had when your allocateRenderResources was
called. If not, the host is buggy: it is rendering more frames than
what it declared as the maximum, which would likely crash a lot of
plugins.
All of this concerns only the host-plugin relationship, and has
actually nothing directly to do with the Core Audio driver/hardware
buffer size. That said, many hosts (including my own host app, AUM)
render exactly as many frames as the current hardware buffer size.
I've never looked at AVAudioEngine, but my guess is that it does the
same.
To control the hardware buffer size, you may ask the AVAudioSession
for a preferred buffer duration, but that's by no means a guarantee
that you'll actually get that size.
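For example, something like this on iOS (the 256-frame target and sample rate are just illustrative numbers):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    let sampleRate = 44_100.0
    // Request (not demand) a ~256-frame I/O buffer.
    try session.setPreferredIOBufferDuration(256.0 / sampleRate) // ~5.8 ms
    try session.setActive(true)
    // The session is free to pick another size; read back what you got.
    print("actual buffer duration: \(session.ioBufferDuration) s")
} catch {
    print("audio session configuration failed: \(error)")
}
```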
To summarise, there are two things a plugin developer needs to do
regarding buffer size:
1) Read the value of maximumFramesToRender in your
allocateRenderResources method, in case you need to allocate internal
buffers.
2) Only rely on, and adapt to, the actual number of frames asked for,
as passed to your render block/callback on each call.
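Point 2 could be sketched like this in the internalRenderBlock of a hypothetical AUAudioUnit subclass (MyEffectAudioUnit is an illustrative name, and the actual DSP is elided):

```swift
import AudioToolbox

class MyEffectAudioUnit: AUAudioUnit {
    override var internalRenderBlock: AUInternalRenderBlock {
        // Capture state before returning the block; don't touch `self`
        // from the real-time render thread.
        let maxFrames = self.maximumFramesToRender
        return { actionFlags, timestamp, frameCount, outputBusNumber,
                 outputData, realtimeEventListHead, pullInputBlock in
            // A well-behaved host never asks for more than it promised.
            guard frameCount <= maxFrames else {
                return kAudioUnitErr_TooManyFramesToProcess
            }
            // ... process exactly `frameCount` frames into `outputData` ...
            return noErr
        }
    }
}
```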
Cheers
/Jonatan Liljedahl - http://kymatica.com
> > On 14 Dec 2018, at 17:48, Waverly Edwards <email@hidden> wrote:
> >
> > init(componentDescription: AudioComponentDescription,options:
> > AudioComponentInstantiationOptions = [])
> >
> > I created an AudioUnitV3 effect and I set the variable
> > "maximumFramesToRender" within the above method. The effect does work but
> > the system overwrites this value with 512 frames, leaving me with no way to
> > alter how many frames are pulled.
> >
> > https://developer.apple.com/documentation/audiotoolbox/auaudiounit/1387654-maximumframestorender
> >
> > The documentation states this must be set before resources are allocated,
> > and I do so in my init. What else must I do in order to change how many
> > frames are pulled in a single cycle?
> >
> > Does anyone have advice on this matter?
_______________________________________________
Coreaudio-api mailing list (email@hidden)