Re: Where should CAStreamBasicDescription be instantiated?
- Subject: Re: Where should CAStreamBasicDescription be instantiated?
- From: Jeff Moore <email@hidden>
- Date: Wed, 27 Feb 2013 14:01:11 -0800
You don't say what plug-in API you are writing to. Knowing that will help answer your actual question about controlling the buffer size for whatever it is you are doing.
That said, I can't think of a single instance where you would set the buffer size of anything using an AudioStreamBasicDescription structure. That structure is there to describe the format of the audio data as opposed to how the data is moved around.
--
Jeff Moore
Core Audio
Apple
On Feb 27, 2013, at 1:24 PM, Jim Griffin <email@hidden> wrote:
> I am working on a plugin that needs more data frames from the audio stream than the default 512. If I could get 2048 instead things would work better for me.
>
> It looks to me that using CAStreamBasicDescription to configure the streams to send me a 2048-frame buffer would be a solution to my problem. I read through the docs and they seem pretty clear, except for where to put the CAStreamBasicDescription in my code.
>
> I have found where the default configuration is called and where it sets the parameters the way it wants, but it seems to basically come out of nowhere. I haven't been able to determine where it is instantiated or what I need to do to override it.
>
> My first problem is where in my plugin to instantiate a CAStreamBasicDescription object so that it will control the stream configuration and not be overwritten by the default configuration.
>
> Are there any example projects that illustrate how the CAStreamBasicDescription is used directly?
_______________________________________________
Coreaudio-api mailing list (email@hidden)