Implementing ramped parameters in AU effects
- Subject: Implementing ramped parameters in AU effects
- From: Chris Rogers <email@hidden>
- Date: Tue, 3 Feb 2004 12:35:57 -0800
There has been some interest on the list recently about how to deal with ramped parameters when implementing AudioUnits. To start with, I'd like to focus on dealing with these in effects, because it's easy and because probably 90% or more of you only care about effects (as opposed to other types of AudioUnits such as mixers).
There are some methods in AUBase and AUEffectBase which do almost all of the dirty work for you. You shouldn't need to override these when dealing with effects, but it's necessary to understand, in general, what they do. When a host calls AudioUnitScheduleParameters(), providing an array of scheduled parameter events, the AudioUnit must divide up your next render buffer (when AudioUnitRender() is called) into smaller pieces based on the timestamps provided by the scheduled parameter events. For effects, this ultimately results in your AUKernelBase::Process() method being called several times during that render cycle, with the source/destination pointers and number of frames to process adjusted to point to smaller, sequential, contiguous pieces within the larger render buffer.
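The slicing described above can be illustrated with a minimal sketch. This is not actual AUBase code; the event offsets and the ProcessSlice() stand-in are hypothetical, standing in for the scheduled-event timestamps and for AUKernelBase::Process() respectively:

```cpp
#include <cassert>
#include <vector>

// Stand-in for AUKernelBase::Process(): here it just copies the slice.
static void ProcessSlice(const float *src, float *dst, unsigned frames)
{
    for (unsigned i = 0; i < frames; ++i)
        dst[i] = src[i];
}

// Hypothetical scheduled-event offsets (in frames) within one render buffer.
static const unsigned kEventOffsets[] = { 0, 128, 300 };

// Split one render buffer at the event offsets and process each piece.
// Each call sees a smaller, contiguous piece of the same buffer, which is
// what the adjusted pointers and frame counts in Process() amount to.
void RenderInSlices(const float *src, float *dst, unsigned totalFrames)
{
    const unsigned numEvents = sizeof(kEventOffsets) / sizeof(kEventOffsets[0]);
    for (unsigned i = 0; i < numEvents; ++i)
    {
        const unsigned begin = kEventOffsets[i];
        const unsigned end   = (i + 1 < numEvents) ? kEventOffsets[i + 1]
                                                   : totalFrames;
        ProcessSlice(src + begin, dst + begin, end - begin);
    }
}
```

The key point is that the slices tile the buffer exactly: every frame is processed once, and each slice carries its own start/end parameter values.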
Luckily for the AudioUnit effect developer, all the DSP code really cares about for each of these slices, for ramped parameters, is:

* start value of parameter
* end value of parameter

In the case of an "immediate" scheduled parameter, the start and end values will be equal. In fact, for all non-ramped parameters the start and end values will also be equal.
Here's a quick code example of how to get the start, end, and convenience "delta" values in the AUKernelBase Process method:
void AUHipass::HipassKernel::Process( const Float32 *inSourceP,
                                      Float32       *inDestP,
                                      UInt32         inFramesToProcess,
                                      UInt32         inNumChannels,
                                      bool          &ioSilence )
{
    Float32 startFrequency;
    Float32 endFrequency;
    Float32 delta;

    // get scheduled param info
    mAudioUnit->Globals()->GetRampSliceStartEnd( kHipassParam_CutoffFrequency,
                                                 startFrequency,
                                                 endFrequency,
                                                 delta );

    if (startFrequency == endFrequency)
    {
        // optimized code for unchanging frequency
        // .... code omitted
    }
    else
    {
        // deal with ramping frequency value
        double freq = startFrequency;
        int n = inFramesToProcess;
        const Float32 *sourceP = inSourceP;   // const: source is read-only
        Float32 *destP = inDestP;
        while (n--)
        {
            GetFilterCoefficients(freq, .....);
            freq += delta;
            Float32 input = *sourceP++;
            // process input based on filter coefficients
            // ........
            Float32 output = ........;
            *destP++ = output;
        }
    }
}
Notice that in this example, the frequency value is ramped linearly. Depending on your application, it may be preferable to implement an exponential ramp (for a frequency parameter) in your DSP code. As in the example above, you may also want to implement a special-case DSP path to optimize for the case when the parameter value is unchanging.
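An exponential ramp can be sketched as follows. ExpRampStep() and ExpRampFill() are hypothetical helpers, not part of the AU SDK, and assume the start and end values are positive (true for a cutoff frequency). Instead of adding a constant delta each sample, you multiply by a constant ratio:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical helper: per-sample multiplicative step so that after
// n samples a value starting at `start` lands exactly on `end`
// (i.e. `end` becomes the first value of the next slice).
inline double ExpRampStep(double start, double end, int n)
{
    return std::pow(end / start, 1.0 / n);
}

// Sketch of the ramping loop, but exponential: multiply by a constant
// ratio each sample instead of adding a delta. `out` receives the
// per-sample frequency values you would feed to GetFilterCoefficients().
inline void ExpRampFill(double start, double end, int n, double *out)
{
    const double step = ExpRampStep(start, end, n);
    double freq = start;
    for (int i = 0; i < n; ++i)
    {
        out[i] = freq;   // use freq to compute filter coefficients here
        freq *= step;
    }
}
```

The multiplicative step gives equal ratios per sample, which for a frequency parameter sounds like a constant sweep rate in pitch rather than in Hz.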
If anybody is sincerely interested in knowing how to deal with ramped parameters for non-effect AudioUnits, I can send out a rather more complicated email about that...
It's important to note that the DSP code cannot rely on altivec alignment of the source and destination buffers when ramped parameters are being used, because the parameter events are sample-accurate and may fall arbitrarily across the altivec grain. Ordinarily, if ramped parameters are *not* published, then AUKernelBase::Process() will have the source and destination buffers altivec-aligned.
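A quick way to test for this at runtime is to check the pointer's low bits. IsVectorAligned() here is an illustrative helper, not part of the AU SDK:

```cpp
#include <cassert>
#include <cstdint>

// A pointer is aligned to a 16-byte (altivec) grain when its
// low four address bits are all zero.
inline bool IsVectorAligned(const void *p)
{
    return (reinterpret_cast<std::uintptr_t>(p) & 0xF) == 0;
}
```

A vectorized Process() path can be guarded with a check like this, falling back to scalar code when a slice boundary lands mid-grain.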
I hope this gets people started. All you should need is a host that will make use of ramped parameters by calling AudioUnitScheduleParameters() to test out your new code.
Chris Rogers
Core Audio
Apple Computer