AudioUnit configuration (discrete channels vs. linked channels)
- Subject: AudioUnit configuration (discrete channels vs. linked channels)
- From: Charles Constant <email@hidden>
- Date: Sun, 13 Mar 2016 21:29:00 -0700
Hope I'm not abusing the list with all my questions ...
Is there a standard way to have a single instance of an AU process multiple elements discretely?
My app has code to host whatever FX unit the user chooses. The input buffer carries an arbitrary number of tracks: the Effects window pops up and the app plays a preview. So far I have this working fine...
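Roughly, the input side looks like this (a simplified sketch; MyRenderProc, AttachInput and the refCon are placeholder names and error checking is left out):

    #include <AudioToolbox/AudioToolbox.h>

    // Placeholder render proc: fills ioData with the selected tracks' samples.
    static OSStatus MyRenderProc(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
    {
        // copy from my track buffers into ioData here
        return noErr;
    }

    // Attach the render proc to the FX unit's input element (bus) 0.
    static void AttachInput(AudioUnit fxUnit, void *refCon)
    {
        AURenderCallbackStruct cb = { MyRenderProc, refCon };
        AudioUnitSetProperty(fxUnit,
                             kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input,
                             0,                     // input element 0
                             &cb, sizeof(cb));
    }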
The trouble is that I want to give the user an option to have the selected buffers processed as either:
a) "Discrete" tracks. Eg: if it's Reverb with 2 input channels, neither of the 2 output channels have reverb bleeding from the other.
b) "Linked" tracks. Eg: track 1 might have some reverb from track 2, if the effect implements that.
Here's my Graph at the moment:
(my render proc for input) -> FxAudioUnit -> MatrixMixerAudioUnit -> SystemOutputAudioUnit
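In code, the wiring is roughly this (simplified, error checks omitted; fxDesc is whatever AudioComponentDescription the user picked):

    #include <AudioToolbox/AudioToolbox.h>

    // Build the chain described above: (render proc) -> FX -> MatrixMixer -> SystemOutput
    static AUGraph BuildPreviewGraph(AudioComponentDescription fxDesc)
    {
        AUGraph graph = NULL;
        NewAUGraph(&graph);

        AudioComponentDescription mixerDesc = {
            kAudioUnitType_Mixer, kAudioUnitSubType_MatrixMixer,
            kAudioUnitManufacturer_Apple, 0, 0 };
        AudioComponentDescription outDesc = {
            kAudioUnitType_Output, kAudioUnitSubType_DefaultOutput,
            kAudioUnitManufacturer_Apple, 0, 0 };

        AUNode fxNode, mixerNode, outNode;
        AUGraphAddNode(graph, &fxDesc, &fxNode);
        AUGraphAddNode(graph, &mixerDesc, &mixerNode);
        AUGraphAddNode(graph, &outDesc, &outNode);

        AUGraphConnectNodeInput(graph, fxNode, 0, mixerNode, 0);
        AUGraphConnectNodeInput(graph, mixerNode, 0, outNode, 0);

        AUGraphOpen(graph);
        AUGraphInitialize(graph);
        return graph;
    }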
From what I can tell, my current setup won't do "discrete."
Is there a way to do this that doesn't involve creating a separate copy of the FX unit for each channel? The only thing that looks like it might be relevant is setting an AudioChannelLayout. Am I on the right track there?
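For reference, this is what I was thinking of trying. It's only a guess: I'm assuming kAudioUnitProperty_AudioChannelLayout with a DiscreteInOrder tag is how you'd ask for "no interaction between channels," but I don't know whether FX units actually honor it:

    // Hypothetical: ask the FX unit to treat its input channels as discrete.
    static void RequestDiscreteLayout(AudioUnit fxUnit, UInt32 numChannels)
    {
        AudioChannelLayout layout = { 0 };
        layout.mChannelLayoutTag = kAudioChannelLayoutTag_DiscreteInOrder | numChannels;
        AudioUnitSetProperty(fxUnit,
                             kAudioUnitProperty_AudioChannelLayout,
                             kAudioUnitScope_Input,
                             0,                    // element 0
                             &layout, sizeof(layout));
    }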
Thanks,
Charles