I want my OS X application to output audio on more than two channels. Example: I have an audio device with 10 output channels and want to send a stereo signal to, say, 4 of them:

```
             +----+  stereoSignal
             | 2/3|---
stereoSignal |    |
-------------|    |  stereoSignal
             | 0/1|---
             +----+
```

But sadly the render callback only provides me with an AudioBufferList containing 2 buffers (mNumberBuffers = 2).
When I initialize the output device, I use an AUGraph containing a MultiChannelMixer unit connected to a DefaultOutput unit.
I create a channel map looking like this:
[0, 1, 2, 3, -1, -1, -1, -1, -1, -1], which I apply to the output scope of the output unit (via kAudioOutputUnitProperty_ChannelMap).
I also set the StreamFormat property of the mixer unit to an AudioStreamBasicDescription (ASBD) with mChannelsPerFrame = 4.
The audio buffers contain non-interleaved audio.
These settings lead me to expect that my render callback receives an AudioBufferList containing four buffers (mNumberBuffers = 4) for me to fill.
What did I forget?