Re: Interleaving confusion...
- Subject: Re: Interleaving confusion...
- From: Doug Wyatt <email@hidden>
- Date: Thu, 3 Jun 2004 12:03:35 -0700
On Jun 2, 2004, at 17:00, Ethan Funk wrote:
When rendering to the HALOutput Audio unit, via an AUGraph, I always
get two buffers passed to my RenderCallback function, one for each
channel. The stream format for the output unit reports
mChannelsPerFrame equal to 2. It was my understanding that this
indicates that the output AU wanted two interleaved channels in a
single buffer.
mChannelsPerFrame = 2 means that there are two channels of audio. When
describing a device's streams (via the AudioHardware / AudioDevice /
etc. APIs), this does mean that the channels are always interleaved.
In the world of Audio Units, (mFormatFlags &
kAudioFormatFlagIsNonInterleaved) indicates non-interleaved data.
Float32 non-interleaved is the "canonical" format for Audio Units and
the only format you can expect an arbitrary unit to support. Some
units, however, like the Apple output units and AUConverter, support
interleaved streams. In your case, you just need to specify an
interleaved stream description as the unit's input.
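For example, specifying an interleaved Float32 stereo description on the output unit's input scope might look roughly like the sketch below. This is only an illustration, not code from the thread: the outputUnit handle, the 44.1 kHz sample rate, and the use of element 0 are assumptions.

    #include <AudioUnit/AudioUnit.h>

    static OSStatus SetInterleavedStereoInput(AudioUnit outputUnit)
    {
        AudioStreamBasicDescription asbd = {0};
        asbd.mSampleRate       = 44100.0;               // assumed sample rate
        asbd.mFormatID         = kAudioFormatLinearPCM;
        // No kAudioFormatFlagIsNonInterleaved flag, so this describes
        // interleaved data.
        asbd.mFormatFlags      = kAudioFormatFlagIsFloat
                               | kAudioFormatFlagIsPacked
                               | kAudioFormatFlagsNativeEndian;
        asbd.mChannelsPerFrame = 2;                     // stereo
        asbd.mBitsPerChannel   = 32;                    // Float32 samples
        asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * sizeof(Float32);
        asbd.mFramesPerPacket  = 1;
        asbd.mBytesPerPacket   = asbd.mBytesPerFrame;

        // Input scope, element 0: the format of the data supplied to the unit.
        return AudioUnitSetProperty(outputUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input,
                                    0,
                                    &asbd,
                                    sizeof(asbd));
    }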
Why, then, do I get two buffers passed into my render
callback? Filling only one buffer results in only one channel of audio
output from the hardware.
All the obvious stuff... setting up and running the AUGraph,
configuring the device for the HALOutput, etc. seems to be working. I
am assigning the Built-in output device to the HALOutput AU.
I _WANT_ to pass the output AU interleaved data, not two discrete
buffers. What am I missing here?
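The two buffers are what the canonical non-interleaved Float32 format looks like at the render callback: one buffer per channel. With an interleaved format set on the output unit's input, the callback instead sees a single buffer whose mNumberChannels is 2. A rough sketch of the difference (the callback name is hypothetical, and it only zero-fills the output):

    #include <AudioUnit/AudioUnit.h>
    #include <string.h>

    static OSStatus MyRenderCallback(void *inRefCon,
                                     AudioUnitRenderActionFlags *ioActionFlags,
                                     const AudioTimeStamp *inTimeStamp,
                                     UInt32 inBusNumber,
                                     UInt32 inNumberFrames,
                                     AudioBufferList *ioData)
    {
        if (ioData->mNumberBuffers == 1 && ioData->mBuffers[0].mNumberChannels == 2) {
            // Interleaved stereo: one buffer, samples laid out L R L R ...
            Float32 *samples = (Float32 *)ioData->mBuffers[0].mData;
            for (UInt32 frame = 0; frame < inNumberFrames; ++frame) {
                samples[2 * frame]     = 0.0f;   // left sample of this frame
                samples[2 * frame + 1] = 0.0f;   // right sample of this frame
            }
        } else {
            // Non-interleaved: one buffer per channel, each holding mono data.
            for (UInt32 buf = 0; buf < ioData->mNumberBuffers; ++buf) {
                memset(ioData->mBuffers[buf].mData, 0,
                       ioData->mBuffers[buf].mDataByteSize);
            }
        }
        return noErr;
    }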