
Re: Dealing with Channel Order in my Effect AudioUnits


  • Subject: Re: Dealing with Channel Order in my Effect AudioUnits
  • From: William Stewart <email@hidden>
  • Date: Tue, 19 Aug 2008 11:10:14 -0700


On Aug 19, 2008, at 6:59 AM, Motti Shneor wrote:

Hello all.

I'm new to AudioUnits, so please pardon my first introductory questions.

I'm enhancing a set of AudioUnit Effect plugins to support Multichannel-Surround processing within the Logic Audio Host.

Our AudioUnits are implemented so that a single channel-configuration is supported by one component. For the same processing core, we create different 'thng's for

Mono->Mono
Mono->Stereo
Mono->5.1
Stereo->5.1
5.1->5.1
etc.

I personally think this is a bad idea, and it is one of the common complaints we get about Waves AUs. We would rather you had a single AU that was able to deal with the complete set of channel configs given to it. Most reasonable use cases can be understood and supported by your AU, as other companies do.


All our AudioUnits derive from the AUMIDIEffectBase SDK class, but they override several AUBase methods as well.
The surround-processing DSP code within our audio units NEEDS TO KNOW the channel order and labels: the LFE is processed differently than Ls, and there are inter-channel influences that take the specific channel label into account.


My first question is general.

1. In the CoreAudio system, should an effect unit publish its preferred input and output channel order, or should it cope with arbitrary channel order in the input and output?

So, with your AU layout above, your AU would publish the following channel configurations:
{ 1, 1 } { 1, 2 } { 1, 6 } { 2, 6 } { 6, 6 }


I think you are probably missing { 2, 2 }

I would also encourage you to do a 5.0 version as it is quite common to *not* deal with the LFE channel in the same channel processing strips as the rest. So you would add:

{ 1, 5 } { 2, 5 } { 5, 5 }
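Roughly, publishing that set in an AUBase-derived unit looks something like the sketch below. It relies on the classic Core Audio SDK's SupportedNumChannels() hook, which backs kAudioUnitProperty_SupportedNumChannels; "MyEffect" is just a stand-in for your AUMIDIEffectBase subclass, and the exact signature may vary by SDK version.

// Sketch: advertise the supported (input, output) channel-count pairs.
// The host reads these via kAudioUnitProperty_SupportedNumChannels.
#include "AUMIDIEffectBase.h"   // classic Core Audio SDK

class MyEffect : public AUMIDIEffectBase {
public:
    // ... constructor, Initialize(), kernel/render code elided ...

    virtual UInt32 SupportedNumChannels(const AUChannelInfo **outInfo)
    {
        // { inChannels, outChannels }
        static const AUChannelInfo kConfigs[] = {
            { 1, 1 }, { 1, 2 }, { 2, 2 },   // mono / stereo
            { 1, 5 }, { 2, 5 }, { 5, 5 },   // 5.0 (LFE handled elsewhere)
            { 1, 6 }, { 2, 6 }, { 6, 6 }    // 5.1
        };
        if (outInfo != NULL)
            *outInfo = kConfigs;            // first call may pass NULL just for the count
        return sizeof(kConfigs) / sizeof(kConfigs[0]);
    }
};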

Now - depending on the answer to that,

2. If I can "demand" a specific channel order (or one of several supported channel orders), how do I specify it? And is it the host's role to provide me with the right channel order?


3. If I need to cope with arbitrary channel order in the input and output:
3.1 How do I inspect the channel order in my input and output elements? I've seen several interchangeable ways of specifying it:
via a ChannelBitmap, via AudioChannelLayoutTag constants such as kAudioChannelLayoutTag_Binaural, kAudioChannelLayoutTag_MPEG_5_0_C, kAudioChannelLayoutTag_ITU_1_0, etc., and via the channel labels that should be in the Element Info structures.

In order for your AU to understand each channel's role, you need to support the ChannelLayout and ChannelLayoutTag properties - see AUBase for the implementation of this. It also doesn't matter which order you get the channels in, as long as you know which channel is which; the host can deal with getting the channels to the right speakers. So you should publish only one channel layout tag per number of channels, and we've defined what we would like to see broadly used in <CoreAudio/CoreAudioTypes.h> (look for the audio unit channel layout tags in this file - I think you've already found it!).
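In AUBase terms that usually means overriding the virtual that backs kAudioUnitProperty_SupportedChannelLayoutTags. A hedged sketch follows - check AUBase.h in your SDK for the exact signature, "MyEffect" is the hypothetical subclass from above, and a fuller implementation would probably vary the set per scope/element:

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudioTypes.h>

// Sketch: report one preferred layout tag per channel count, using the
// "AudioUnit" layout tags from <CoreAudio/CoreAudioTypes.h>.
UInt32 MyEffect::GetChannelLayoutTags(AudioUnitScope scope,
                                      AudioUnitElement element,
                                      AudioChannelLayoutTag *outLayoutTags)
{
    static const AudioChannelLayoutTag kTags[] = {
        kAudioChannelLayoutTag_Mono,
        kAudioChannelLayoutTag_Stereo,
        kAudioChannelLayoutTag_AudioUnit_5_0,
        kAudioChannelLayoutTag_AudioUnit_5_1
    };
    const UInt32 count = sizeof(kTags) / sizeof(kTags[0]);
    if (outLayoutTags != NULL) {          // the first call asks only for the count
        for (UInt32 i = 0; i < count; ++i)
            outLayoutTags[i] = kTags[i];
    }
    return count;
}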


If you just use the tags as described above, then you should only have to deal with one channel order. However, different AU developers have different ideas of how this should be implemented, so in the real world there is a wider spread of channel orders than we would have preferred to see. The ACL is, as you describe, a very flexible structure, but it is also self-describing.

There are properties in <AudioToolbox/AudioFormat.h> that can give you back a fully specified ACL where the channel descriptions are used - you could use these to at least give yourself one shape of ACL to deal with in the rest of your code.
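For example (a sketch - the helper name is made up and error handling is trimmed), kAudioFormatProperty_ChannelLayoutForTag expands a layout tag into an ACL filled out with per-channel descriptions, so the rest of your code only ever sees channel labels:

#include <AudioToolbox/AudioFormat.h>
#include <vector>

// Sketch: turn a layout tag into a label-per-channel list.
// (kAudioFormatProperty_ChannelLayoutForBitmap does the same for bitmaps.)
static std::vector<AudioChannelLabel> LabelsForLayoutTag(AudioChannelLayoutTag tag)
{
    UInt32 size = 0;
    if (AudioFormatGetPropertyInfo(kAudioFormatProperty_ChannelLayoutForTag,
                                   sizeof(tag), &tag, &size) != noErr)
        return std::vector<AudioChannelLabel>();

    std::vector<UInt8> buffer(size);
    AudioChannelLayout *acl = reinterpret_cast<AudioChannelLayout *>(&buffer[0]);
    if (AudioFormatGetProperty(kAudioFormatProperty_ChannelLayoutForTag,
                               sizeof(tag), &tag, &size, acl) != noErr)
        return std::vector<AudioChannelLabel>();

    std::vector<AudioChannelLabel> labels;
    for (UInt32 i = 0; i < acl->mNumberChannelDescriptions; ++i)
        labels.push_back(acl->mChannelDescriptions[i].mChannelLabel);
    return labels;   // e.g. Left, Right, Center, LFE, LeftSurround, RightSurround
}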

3.2 Can the channel order change between processing calls, or is it set once when the AU is initialized? (In other words, do I need to map input/output channels every time I'm called to render audio, or should I do my mapping only once at initialization?)

It is set.

The ACL is seen as metadata on the connection format - it is still the data format on the input/output that is the primary vehicle for describing the I/O. As this can't generally be changed while an AU is initialised, you can assume that the ACL will be stable as well.
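So a one-time mapping built at initialisation time is all you need - something like the sketch below, where the member names (mLfeIndex and friends) are just placeholders for whatever your plugin class keeps around:

#include <CoreAudio/CoreAudioTypes.h>

// Sketch: find which buffer slot carries a given role in the negotiated layout.
static SInt32 IndexOfLabel(const AudioChannelLayout &acl, AudioChannelLabel label)
{
    for (UInt32 i = 0; i < acl.mNumberChannelDescriptions; ++i)
        if (acl.mChannelDescriptions[i].mChannelLabel == label)
            return (SInt32)i;
    return -1;   // role not present (e.g. no LFE in a 5.0 layout)
}

// Called once from Initialize(); the render loop then treats the LFE and the
// surrounds differently without re-inspecting the ACL on every call.
void MyEffect::CacheChannelMap(const AudioChannelLayout &inputACL)
{
    mLfeIndex           = IndexOfLabel(inputACL, kAudioChannelLabel_LFEScreen);
    mLeftSurroundIndex  = IndexOfLabel(inputACL, kAudioChannelLabel_LeftSurround);
    mRightSurroundIndex = IndexOfLabel(inputACL, kAudioChannelLabel_RightSurround);
}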

Thanks

Bill



References: 
  • Dealing with Channel Order in my Effect AudioUnits (From: Motti Shneor <email@hidden>)
