Re: AUGraph deadlocks
- Subject: Re: AUGraph deadlocks
- From: patrick machielse <email@hidden>
- Date: Mon, 05 Dec 2011 17:02:01 +0100
On 5 Dec 2011, at 11:23, Brian Willoughby wrote:
> On Dec 4, 2011, at 15:26, patrick machielse wrote:
>
>> At 'pre-render' time, the render thread first checks if there is a new processing 'recipe' available, and then updates all units in the graph according to the recipe for the current render time, using AUBase API.
>
> You've violated the separation of AU engine and non-engine code by accessing your 'recipe' data from within the engine. The parameter and property API are designed to cross the CoreAudio thread boundaries for you, so that none of this 'recipe' code needs to run inside any kind of render routine, pre-render or otherwise.
I'm afraid I haven't been clear in outlining my implementation, or perhaps I was confusing in my use of the term 'rendering engine'.
Schematically, my "Audio Processing Component" works like this:
====
function renderCallback {
    if ( kAudioUnitRenderAction_PreRender ) {
        - get new processingRecipe (synchronized) if there is one available
        - adjust settings for audio units in the processing graph for current time
    }
}

@interface ProcessingEngine
{
    AUGraph graph;
    id      processingRecipe;
}
@end

@implementation ProcessingEngine
- start {
    - create graph
    - install renderCallback on the graph output unit (either generic output or default output)
    - AUGraphStart(graph);
}
@end
====
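The "get new processingRecipe (synchronized)" step can be sketched as a lock-free pointer exchange, so the render thread never blocks on a lock held by the control thread. This is a minimal portable C sketch of the pattern; the names (RecipeSlot, recipe_publish, recipe_take) are mine, not Core Audio API, and Recipe is a stand-in for whatever the real recipe object is.

```c
#include <stdatomic.h>
#include <stddef.h>

typedef struct Recipe { double gain; } Recipe;  /* placeholder recipe payload */

/* Single-slot mailbox: the control thread publishes, the render thread takes. */
typedef struct { _Atomic(Recipe *) pending; } RecipeSlot;

/* Control thread: hand a new recipe to the render thread without blocking.
   Returns the previously pending (now superseded) recipe, if any, so the
   control thread can dispose of it -- never free memory on the render thread. */
static Recipe *recipe_publish(RecipeSlot *slot, Recipe *fresh) {
    return atomic_exchange(&slot->pending, fresh);
}

/* Render thread, at pre-render: take the new recipe if one is available.
   Returns NULL when nothing new has been published. */
static Recipe *recipe_take(RecipeSlot *slot) {
    return atomic_exchange(&slot->pending, NULL);
}
```

The render callback would keep the most recently taken recipe in its own current pointer; superseded recipes should be retired back on the control thread, since free() and Objective-C release are not real-time safe.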
The processingRecipe prescribes the AudioUnit settings to use as a function of time, f(t). The settings change continuously during processing (whether the user changes the recipe or not) and must be adjusted on each render cycle.
Performing these adjustments from the 'outside' seems harder to achieve than performing them from the renderCallback function. Also, the implementation of my custom audio unit would require more care (threading-wise).
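A recipe that prescribes settings as f(t) can be as simple as a piecewise-linear ramp evaluated at the current render time. A hypothetical sketch (ParamRamp and ramp_value are illustrative names, not from the actual code):

```c
/* Hypothetical: one parameter ramping linearly between two points in time. */
typedef struct {
    double t0, v0;   /* value v0 at time t0 */
    double t1, v1;   /* value v1 at time t1 */
} ParamRamp;

/* Evaluate the ramp at render time t, clamping outside [t0, t1]. */
static double ramp_value(const ParamRamp *r, double t) {
    if (t <= r->t0) return r->v0;
    if (t >= r->t1) return r->v1;
    double frac = (t - r->t0) / (r->t1 - r->t0);
    return r->v0 + frac * (r->v1 - r->v0);
}
```

At pre-render the callback would evaluate each such ramp at the time taken from the AudioTimeStamp and apply the result with AudioUnitSetParameter (buffer offset 0).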
> Instead, what you should do is have your non-AU recipe processing send SetParameter and/or SetProperty calls to your AU any time that things change. There's no need to move this code inside the pre-render routine - that just causes avoidable issues. Keep in mind that AudioUnitSetParameter() has inBufferOffsetInFrames to control timing of parameter changes.
The documentation suggests that this facility shouldn't be used, and that AudioUnitScheduleParameters() should be used instead. With AudioUnitScheduleParameters(), part of the parameter-setting task could be moved to a different location / thread, at the cost of more complicated code. It also wouldn't be a solution for all parameters: there is some dynamic feedback between processing and control, which means not all audio unit parameter settings can be known ahead of time.
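For the parameters that can be known ahead of time, scheduling amounts to a time-ordered event queue that the render side drains up to the current buffer; the feedback-dependent parameters are exactly the ones that can't be placed in such a queue in advance. A portable sketch of the idea (this is an illustration of the concept, not the actual AudioUnitScheduleParameters() implementation; the names are mine):

```c
typedef struct {
    long  sampleTime;  /* absolute sample time the change should take effect */
    int   paramID;
    float value;
} ParamEvent;

/* Example sink for testing: records the last frame offset applied.
   In real code 'apply' would stand in for AudioUnitSetParameter with
   an inBufferOffsetInFrames argument. */
static long g_lastOffset = -1;
static void record_apply(int paramID, float value, long offset) {
    (void)paramID; (void)value;
    g_lastOffset = offset;
}

/* Apply every queued event whose time falls inside the current buffer
   [bufStart, bufEnd). Returns the number of events applied. */
static int drain_events(ParamEvent *q, int count, long bufStart, long bufEnd,
                        void (*apply)(int paramID, float value, long offset)) {
    int applied = 0;
    for (int i = 0; i < count; i++) {
        if (q[i].sampleTime >= bufStart && q[i].sampleTime < bufEnd) {
            apply(q[i].paramID, q[i].value, q[i].sampleTime - bufStart);
            applied++;
        }
    }
    return applied;
}
```

The point of the sketch: every event needs a known sampleTime and value before the buffer is rendered, which is precisely what the feedback-coupled parameters can't provide.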
patrick
--
Patrick Machielse
Hieper Software
http://www.hieper.nl
email@hidden
_______________________________________________
Coreaudio-api mailing list (email@hidden)