
Re: AUGraph deadlocks


  • Subject: Re: AUGraph deadlocks
  • From: Brian Willoughby <email@hidden>
  • Date: Mon, 05 Dec 2011 02:23:26 -0800


On Dec 4, 2011, at 15:26, patrick machielse wrote:
> The information is coming from the user, but not through any AudioUnit UI. My custom audio units don't have a UI, and I'm not using the default UI of any other bundled unit.
You seem to be making the assumption that there is something special about the custom UI of an AU. All communication with an AU engine is supposed to occur exclusively through the parameter and property API. There are many examples where an AU host will directly call this API without being part of some plugin UI or default UI. As far as I understand, the AU engine should not have access to any outside data except via the official Set/Get API.
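
For example, a host can push state to a unit entirely through the official API, along these lines (a minimal sketch; the property ID and recipe struct are hypothetical, not from any Apple header):

    #include <AudioToolbox/AudioToolbox.h>

    // Hypothetical custom property; third-party AU property IDs are
    // expected to use values of 64000 and above.
    static const AudioUnitPropertyID kMyRecipeProperty = 64000;

    struct MyRecipe { Float32 gain; Float32 pan; };  // hypothetical payload

    OSStatus PushRecipeToUnit(AudioUnit unit, const MyRecipe &recipe)
    {
        // The host never touches the engine's internals; the property
        // API carries the data across the thread boundary for us.
        return AudioUnitSetProperty(unit, kMyRecipeProperty,
                                    kAudioUnitScope_Global, 0,
                                    &recipe, sizeof(recipe));
    }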

> At 'pre-render' time, the render thread first checks if there is a new processing 'recipe' available, and then updates all units in the graph according to the recipe for the current render time, using AUBase API.

You've violated the separation of AU engine and non-engine code by accessing your 'recipe' data from within the engine. The parameter and property APIs are designed to cross the CoreAudio thread boundaries for you, so that none of this 'recipe' code needs to run inside any kind of render routine, pre-render or otherwise.


Instead, what you should do is have your non-AU recipe processing send SetParameter and/or SetProperty calls to your AU whenever things change. There's no need to move this code inside the pre-render routine - that just causes avoidable issues. Keep in mind that AudioUnitSetParameter() takes an inBufferOffsetInFrames argument to control the timing of parameter changes.
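
For instance (again a sketch; kMyGainParam stands in for a parameter ID your unit would define):

    #include <AudioToolbox/AudioToolbox.h>

    static const AudioUnitParameterID kMyGainParam = 0;  // hypothetical

    OSStatus ApplyGainChange(AudioUnit unit,
                             AudioUnitParameterValue newGain,
                             UInt32 offsetFrames)
    {
        // offsetFrames is passed as inBufferOffsetInFrames, scheduling
        // the change that many frames into the next buffer; 0 means now.
        return AudioUnitSetParameter(unit, kMyGainParam,
                                     kAudioUnitScope_Global, 0,
                                     newGain, offsetFrames);
    }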

Later, if you decide to add an AU UI, it can stay in sync with your non-AU code simply by registering for these parameter change notifications; the presence of a new UI will not change the basic mechanism. In fact, there's no particular need to ever have a custom or even a default UI.
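
A sketch of that registration, using the listener API from <AudioToolbox/AudioUnitUtilities.h> (kMyGainParam is again hypothetical):

    #include <AudioToolbox/AudioToolbox.h>
    #include <AudioToolbox/AudioUnitUtilities.h>

    static const AudioUnitParameterID kMyGainParam = 0;  // hypothetical

    static void GainChanged(void *inUserData, void *inObject,
                            const AudioUnitParameter *inParameter,
                            AudioUnitParameterValue inValue)
    {
        // A UI would refresh its slider (or similar) here.
    }

    OSStatus ListenForGainChanges(AudioUnit unit,
                                  AUParameterListenerRef *outListener)
    {
        OSStatus err = AUListenerCreate(GainChanged, NULL,
                                        CFRunLoopGetMain(),
                                        kCFRunLoopDefaultMode,
                                        0.1f,  // notification interval (s)
                                        outListener);
        if (err != noErr) return err;

        AudioUnitParameter param = { unit, kMyGainParam,
                                     kAudioUnitScope_Global, 0 };
        return AUListenerAddParameter(*outListener, NULL, &param);
    }

Note that if the non-UI code changes values through AUParameterSet() rather than AudioUnitSetParameter(), registered listeners are notified automatically.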

> There is a clear division between the UI and the audio processing engine. At some point the render settings must be applied to the audio units. kAudioUnitRenderAction_PreRender seemed the best opportunity to me.

There might be a clear division between the UI, per se, and the audio processing engine, but there is not any division between your non-engine data (the 'recipe') and your engine data. AudioUnits are designed such that the engine should never access any data except for members of its own object instances. Any time your engine object's member variables get out of sync with your recipe, you need to call the Set/Get API from outside the engine to bring them up to date.


Keep in mind that there are multiple steps to changing a parameter, some of which are partially hidden by the default class code in an AU. The first step is that some piece of non-engine code calls SetParameter with appropriate parameter values. The next step is that the AudioUnit code uses these parameters to alter local member variables. The final step is that the Render routine accesses the member variables to determine how to process the audio correctly according to the real time parameters.
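
Schematically, that flow looks something like this (a simplified stand-in, not actual AUBase code; the class and method names are illustrative):

    #include <AudioToolbox/AudioToolbox.h>

    class MyEffectSketch {  // hypothetical stand-in for an AU subclass
    public:
        // Step 2: the AU turns an incoming SetParameter call into a
        // member variable.
        void HandleSetParameter(AudioUnitParameterValue value)
        {
            mGain = value;
        }

        // Step 3: the render routine reads only its own members.
        void Render(float *buffer, UInt32 frames)
        {
            for (UInt32 i = 0; i < frames; ++i)
                buffer[i] *= mGain;
        }

    private:
        AudioUnitParameterValue mGain = 1.0f;
    };

    // Step 1 is the host-side AudioUnitSetParameter() call shown above.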

What you've done is expand the render process so that it now needs to access data outside your object's members. This creates all kinds of threading issues, including unsynchronized access to the same data from multiple threads. Just because you're doing this in PreRender does not make it a good idea. Basically, I see no reason why you would have to do things the way you're doing them now. All of that pre-render code needs to be moved outside your engine.


Basically, it seems like you're confusing the responsibilities of an AU host with the responsibilities of AU engine (pre)rendering code. Unless I'm missing something huge, you'd be far better off following normal CoreAudio coding techniques.


Brian Willoughby
Sound Consulting

P.S. Another confusing issue here is the distinction between the implementation of an AudioUnit versus the coding techniques needed to host an AUGraph. From the point of view of a host application that is a client of AUGraph, you probably would start out in your AUHAL render callback by sending various AudioUnitSetParameter() calls to the various AudioUnits in your graph. This is totally separate from the pre-render call of any particular AU. There are still limitations on what you can call. What's confusing to me is why you need to handle AU host duties inside the pre-render routine of an AU plugin.
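
That host-side pattern might look roughly like this (a sketch; HostState and kMyGainParam are hypothetical, and the callback is installed with AudioUnitAddRenderNotify()):

    #include <AudioToolbox/AudioToolbox.h>

    static const AudioUnitParameterID kMyGainParam = 0;  // hypothetical

    struct HostState {                 // hypothetical host-owned state
        AudioUnit effectUnit;          // a unit inside the AUGraph
        AudioUnitParameterValue pendingGain;
    };

    static OSStatus HostPreRender(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber, UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
    {
        HostState *host = static_cast<HostState *>(inRefCon);
        if (*ioActionFlags & kAudioUnitRenderAction_PreRender) {
            // Host duty: push this cycle's parameter values into the
            // graph before audio is pulled through it.
            AudioUnitSetParameter(host->effectUnit, kMyGainParam,
                                  kAudioUnitScope_Global, 0,
                                  host->pendingGain, 0);
        }
        return noErr;
    }

    // Registered on the graph's output unit with:
    //   AudioUnitAddRenderNotify(outputUnit, HostPreRender, &hostState);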


