Re: AudioBuffer fields in AudioUnitRender
- Subject: Re: AudioBuffer fields in AudioUnitRender
- From: Doug Wyatt <email@hidden>
- Date: Tue, 8 Mar 2005 09:06:21 -0800
On Mar 8, 2005, at 1:33, Craig Bakalian wrote:
> The question here is -> how do I set the fields of the AudioBuffer
> to coordinate with the AudioStreamBasicDescription of the AudioUnit
> that is getting the renderCallback?
>
> for (int i = 0; i < numberOfChannels; i++)
> {
>     bufferList->mBuffers[i].mNumberOfChannels = ?
The number of channels according to the input's stream format (its mChannelsPerFrame).
>     bufferList->mBuffers[i].mDataByteSize = framesPerSlice * sizeof( float, int, unsigned ?)
framesPerSlice is not constant, especially when you've got a sample rate
converter in your chain. The number of frames you're being asked to render
is passed to AudioUnitRender as inNumberFrames; multiply that by the input
stream format's mBytesPerFrame.
>     bufferList->mBuffers[i].mData = NULL; // I know this must be set to NULL
Not true: if the input is coming from a callback function instead of
a connection to another AU, this must be non-NULL.
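
To make that concrete, here is a rough sketch of the filled-in setup for a
single interleaved buffer pulled with AudioUnitRender. The names sourceUnit,
fmt, and inputMemory are illustrative, not from the post, and note that the
field is declared mNumberChannels (not mNumberOfChannels) in CoreAudioTypes.h;
a non-interleaved format would instead use one mono buffer per channel.

#include <AudioUnit/AudioUnit.h>

// Sketch only: fill an AudioBufferList from the input's stream format and
// pull inNumberFrames from an upstream unit with AudioUnitRender.
static OSStatus PullOneBuffer(AudioUnit                          sourceUnit,
                              const AudioStreamBasicDescription *fmt,
                              AudioUnitRenderActionFlags        *ioActionFlags,
                              const AudioTimeStamp              *inTimeStamp,
                              UInt32                             inNumberFrames,
                              void                              *inputMemory)
{
    AudioBufferList abl;
    abl.mNumberBuffers = 1;                                  // interleaved: one buffer

    // Channel count comes from the input's stream format.
    abl.mBuffers[0].mNumberChannels = fmt->mChannelsPerFrame;

    // inNumberFrames can change from render call to render call,
    // so recompute the byte size every time.
    abl.mBuffers[0].mDataByteSize = inNumberFrames * fmt->mBytesPerFrame;

    // NULL is only acceptable when the rendering unit supplies its own
    // memory; when the input comes from a callback, this must point at
    // storage you have allocated yourself.
    abl.mBuffers[0].mData = inputMemory;

    return AudioUnitRender(sourceUnit, ioActionFlags, inTimeStamp,
                           0 /* source's output bus */, inNumberFrames, &abl);
}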
AUBase delegates PullInput to AUInputElement. Notice that
AUInputElement has mIOBuffer, and takes care of allocating memory for
its buffers when the input is connected to a callback function as
opposed to being disconnected or connected to another AU. Notice that
AUInputElement::PullInput (via mIOBuffer.PrepareBuffer or
PrepareNullBuffer) does what you're trying to do here. Why not just
call it?
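
For reference, a rough sketch of what that looks like from inside an AUBase
subclass's Render override. The class and method names (GetInput, PullInput,
GetBufferList) are recalled from the SDK's AUPublic sources, MyUnit is a
hypothetical subclass, and exact signatures may differ between SDK versions,
so treat this as an outline rather than copy-paste code.

// Outline only: let AUInputElement do the buffer bookkeeping.
ComponentResult MyUnit::Render(AudioUnitRenderActionFlags &ioActionFlags,
                               const AudioTimeStamp &inTimeStamp,
                               UInt32 inNumberFrames)
{
    AUInputElement *input = GetInput(0);

    // PullInput prepares mIOBuffer (allocating it when the input is fed by
    // a callback) and fetches the audio -- the same work as the hand-rolled
    // field setup above.
    ComponentResult result = input->PullInput(ioActionFlags, inTimeStamp,
                                              0 /* element */, inNumberFrames);
    if (result != noErr)
        return result;

    const AudioBufferList &inData = input->GetBufferList();
    // ... process inData into the output element's buffer list ...
    return noErr;
}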
Doug