Handling interactions between channels in an AU
- Subject: Handling interactions between channels in an AU
- From: Paul Cantrell <email@hidden>
- Date: Thu, 24 Mar 2005 16:12:59 -0600
Please pardon my ignorance while I ask a very basic question about
writing AUs.
I'm trying to write a very simple stereo imaging AU that deals with
LR/MS processing, etc. As such, there are interactions between
channels, and I need to deal with both input channels simultaneously in
my processing method.
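(For anyone unfamiliar with the term: mid/side processing is the classic example of why both channels are needed at once. A minimal standalone sketch, using hypothetical names, of the kind of channel-coupled math involved:)

```cpp
#include <cstddef>

// Hypothetical illustration: MS encoding needs the left AND right sample
// of each frame, so the two channels cannot be processed independently.
static void EncodeMS(const float* left, const float* right,
                     float* mid, float* side, size_t frames)
{
    for (size_t i = 0; i < frames; ++i) {
        mid[i]  = 0.5f * (left[i] + right[i]);  // mid  = (L + R) / 2
        side[i] = 0.5f * (left[i] - right[i]);  // side = (L - R) / 2
    }
}

static void DecodeMS(const float* mid, const float* side,
                     float* left, float* right, size_t frames)
{
    for (size_t i = 0; i < frames; ++i) {
        left[i]  = mid[i] + side[i];   // L = M + S
        right[i] = mid[i] - side[i];   // R = M - S
    }
}
```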
The sample code in SampleEffectUnit::SampleEffectKernel::Process seems
to be saying very clearly that MyKernel::Process will receive
interleaved samples:
    while (nSampleFrames-- > 0) {
        Float32 inputSample = *sourceP;
        sourceP += inNumChannels; // advance to next frame (e.g. if stereo, we're advancing 2 samples)
        // we're only processing one of an arbitrary number of interleaved channels
        ...etc...
However, reading the source for AUEffectBase and company, it really
looks like it's instantiating one kernel per channel.
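(To make the problem concrete, here is a toy re-creation of that dispatch pattern, not the SDK code itself: one kernel per channel, each handed only its own deinterleaved buffer. A kernel's Process simply has no pointer to the other channel's data.)

```cpp
#include <cstddef>
#include <vector>

// Toy model (not AU SDK code) of the one-kernel-per-channel pattern.
struct Kernel {
    // Each kernel sees a single channel's samples; anything it computes
    // cannot reference the other channel.
    void Process(const float* in, float* out, size_t frames) {
        for (size_t i = 0; i < frames; ++i)
            out[i] = in[i] * 0.5f;  // example per-channel-only operation
    }
};

// Dispatch loop: one kernel per channel, each given its own buffer.
static void ProcessAllChannels(std::vector<Kernel>& kernels,
                               const float* const* in, float* const* out,
                               size_t frames)
{
    for (size_t ch = 0; ch < kernels.size(); ++ch)
        kernels[ch].Process(in[ch], out[ch], frames);
}
```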
The documentation is -- with apologies to the AU team -- really piss
poor. (I'm actually kind of surprised -- usually Apple puts out better
stuff.) Parts of it seem to say that all AUs receive deinterleaved
samples; other parts seem to say that they should support interleaved
or deinterleaved. There is no real explanation of how to implement
MyKernel::Process, other than the sample code pasted above.
Looking around the code and the web, it looks like I need to override
AUEffectBase::Render or AUEffectBase::ProcessBufferLists or ...
something.
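(If overriding ProcessBufferLists is indeed the right route, the inner loop might look something like the standalone sketch below. This is an assumption, not SDK code: the idea is that inside an overridden AUEffectBase::ProcessBufferLists you would pull the two deinterleaved channel pointers from inBuffer.mBuffers[0].mData and inBuffer.mBuffers[1].mData and pass them to a channel-coupled routine. The "width" parameter is hypothetical, scaling the side signal: 0 = mono, 1 = unchanged.)

```cpp
#include <cstddef>

// Channel-coupled inner loop you could call from an overridden
// ProcessBufferLists, given both deinterleaved channel pointers at once.
// width (hypothetical parameter): 0 collapses to mono, 1 leaves the image alone.
static void ProcessStereoWidth(const float* inL, const float* inR,
                               float* outL, float* outR,
                               size_t frames, float width)
{
    for (size_t i = 0; i < frames; ++i) {
        float mid  = 0.5f * (inL[i] + inR[i]);          // mono component
        float side = 0.5f * (inL[i] - inR[i]) * width;  // scaled stereo component
        outL[i] = mid + side;
        outR[i] = mid - side;
    }
}
```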
Can anybody point me in the right direction?
Cheers,
Paul
_________________________________________________________________
"Prediction is hard, especially of the future." -- Niels Bohr
_______________________________________________
Coreaudio-api mailing list (email@hidden)