Re: More on: Error in AudioUnitRender()
- Subject: Re: More on: Error in AudioUnitRender()
- From: philippe wicker <email@hidden>
- Date: Sat, 10 Dec 2005 11:03:10 +0100
On Dec 10, 2005, at 6:11 AM, john wrote:
Hi Doug,
I think I've realised why I'm getting the paramErr - I'd like the audio
data in interleaved format and am preparing the AudioBufferList for
the call to AudioUnitRender accordingly. Even if I call
AudioUnitSetProperty with kAudioUnitProperty_StreamFormat and
kAudioUnitScope_Output (or _Global), I'm still getting the error
when calling AudioUnitRender.
Can this be done, or do I have to be able to support 2
non-interleaved channels?
The native audio format for current (V2) Audio Units (V1 is
deprecated) is NON-interleaved 32-bit float. Interleaved channels are
used when dealing with audio devices, or when reading/writing a file
to disk, ....
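For reference, a minimal sketch of rendering with that native
non-interleaved Float32 layout, assuming a stereo unit at 44.1 kHz
(myAUInstance, inFrames and the timestamp setup are placeholder
names/values, not from the original messages):

#include <AudioUnit/AudioUnit.h>
#include <stdlib.h>

// Ask the unit's output element for non-interleaved 32-bit float, stereo.
AudioStreamBasicDescription fmt = {0};
fmt.mSampleRate       = 44100.0;               // assumed sample rate
fmt.mFormatID         = kAudioFormatLinearPCM;
fmt.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked
                        | kAudioFormatFlagIsNonInterleaved;
fmt.mChannelsPerFrame = 2;
fmt.mBitsPerChannel   = 32;
fmt.mFramesPerPacket  = 1;
fmt.mBytesPerFrame    = sizeof(Float32);       // per channel when non-interleaved
fmt.mBytesPerPacket   = sizeof(Float32);
AudioUnitSetProperty(myAUInstance, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &fmt, sizeof(fmt));

// Non-interleaved rendering wants one AudioBuffer per channel,
// each carrying mono (mNumberChannels == 1) Float32 data.
UInt32 inFrames = 512;
AudioBufferList *abl =
    (AudioBufferList *)malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer));
abl->mNumberBuffers = 2;
for (UInt32 i = 0; i < 2; ++i) {
    abl->mBuffers[i].mNumberChannels = 1;
    abl->mBuffers[i].mDataByteSize   = inFrames * sizeof(Float32);
    abl->mBuffers[i].mData           = malloc(inFrames * sizeof(Float32));
}

AudioTimeStamp ts = {0};
ts.mSampleTime = 0;
ts.mFlags = kAudioTimeStampSampleTimeValid;
AudioUnitRenderActionFlags flags = 0;
OSStatus err = AudioUnitRender(myAUInstance, &flags, &ts, 0, inFrames, abl);

Whichever scope the format is set on, the AudioBufferList handed to
AudioUnitRender has to match the format the unit actually accepted;
a mismatch is one of the usual causes of paramErr.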
Thanks.
-- John
You can interrogate kAudioUnitProperty_MaximumFramesPerSlice and
respect it -- or you can set it:
UInt32 frames = 2048;
AudioUnitSetProperty(myReverbAUInstance,
                     kAudioUnitProperty_MaximumFramesPerSlice,
                     kAudioUnitScope_Global,
                     0,                   // element 0
                     &frames,
                     sizeof(frames));
Doug
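And for the "interrogate" side of Doug's suggestion, a minimal sketch
of reading the property back (reusing the myReverbAUInstance name from
his snippet above):

UInt32 maxFrames = 0;
UInt32 size = sizeof(maxFrames);
OSStatus err = AudioUnitGetProperty(myReverbAUInstance,
                                    kAudioUnitProperty_MaximumFramesPerSlice,
                                    kAudioUnitScope_Global,
                                    0,
                                    &maxFrames,
                                    &size);
// On success, never ask AudioUnitRender for more than maxFrames per call.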