That's right - that's what noninterleaved means: each channel is treated separately.
It can be a little confusing, because the AudioBufferList structure declares only a single AudioBuffer within it (mBuffers[1]) - so to hold more than one buffer, you have to either malloc the buffer list plus extra space for the additional buffers, or use a struct or something on the stack.
For example, to prepare an AudioBufferList on the stack to receive kBufferSize floating-point, stereo, non-interleaved samples:
struct { AudioBufferList bufferList; AudioBuffer secondBuffer; } buffers;
buffers.bufferList.mNumberBuffers = 2;
for ( int i=0; i<buffers.bufferList.mNumberBuffers; i++ ) {
    // One channel per buffer, as the audio is non-interleaved
    buffers.bufferList.mBuffers[i].mNumberOfChannels = 1;
    buffers.bufferList.mBuffers[i].mDataByteSize = kBufferSize * sizeof(float);
    buffers.bufferList.mBuffers[i].mData = malloc(kBufferSize * sizeof(float));
}
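If you'd rather put the buffer list itself on the heap, the malloc approach mentioned above would look something like this - just a sketch, using the same assumed kBufferSize constant, with error checking omitted:

#include <stdlib.h>
#include <CoreAudio/CoreAudioTypes.h>

// AudioBufferList only declares mBuffers[1], so allocate room for one extra AudioBuffer
AudioBufferList *bufferList = malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer));
bufferList->mNumberBuffers = 2;
for ( int i=0; i<2; i++ ) {
    bufferList->mBuffers[i].mNumberOfChannels = 1; // one channel per buffer, non-interleaved
    bufferList->mBuffers[i].mDataByteSize = kBufferSize * sizeof(float);
    bufferList->mBuffers[i].mData = malloc(kBufferSize * sizeof(float));
}

Just remember to free the mData pointers and the buffer list itself when you're done with them.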
--
A Tasty Pixel: App artisans
Latest news: Loopy HD wins 2nd place in the 2011 Best App Ever Awards for Best Musicians App!
On 23 Feb 2012, at 18:06, Gregory Wieber wrote:

Thanks Michael,
Sure - I'll post it when I put together a simpler test app.
Curious: were you reading the noninterleaved audio into two different buffers in a bufferlist? That's the thing I can't seem to find any examples of online.
best,
Greg

On Thu, Feb 23, 2012 at 2:22 AM, Michael Tyson <email@hidden> wrote: