Re: Two AUs in one bundle
- Subject: Re: Two AUs in one bundle
- From: ipmlists <email@hidden>
- Date: Wed, 9 Sep 2009 16:33:35 +0100
2009/8/28 William Stewart <email@hidden>:
>
> On Aug 26, 2009, at 6:32 AM, ipmlists wrote:
>
>
> 2009/8/25 William Stewart <email@hidden>
>>
>> On Aug 24, 2009, at 5:09 AM, ipmlists wrote:
>>
>>> I'd like to combine two AUs into one bundle, so they're launched
>>> together, and both are controlled from one Cocoa view. From searching the
>>> archives, I _think_ this is possible, (this thread, for example:
>>> <http://lists.apple.com/archives/Coreaudio-api/2008/May/msg00108.html> and
>>> the earlier one it refers to) but it would be nice to get that confirmed.
>>>
>>> If so, then how do I specify their connectivity - by bundling them
>>> inside an AUGraph inside an AU?
>>
>> All that a host sees is the audio unit that you publish. How you manage
>> the internal state is up to you - it can be as simple or complex a chain as
>> you need
>>
>> Bill
>
> Thanks Bill. I'm afraid I'm still confused as to the plumbing of this. I now
> have a third AU, which contains in its constructor the AUGraph setup code
> (the graph is MyAU1 -> MyAU2 -> Generic I/O AU).
>
> ok
>
> What about input/output? For the output, is it just a matter of making sure
> the stream formats of my 'container' AU and the I/O AU match?
>
> the output of your I/O AU needs to match the output of the wrapper AU
> the input of your wrapper AU needs to match the input format of MyAU1.
> I would start by making ALL of the formats the same - stereo, 44.1,
> deinterleaved (which is the default). In your wrapper AU's Initialize call,
> you are going to have to make sure your graph is all set up with your
> formats, etc. Your wrapper AU's Initialize call is also where you
> initialize the graph (and uninitialize it in the DoCleanup
> call).
>
>
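[A rough sketch of what that format-and-Initialize pairing might look like in an AUEffectBase-style wrapper. This is only an illustration of the advice above, not working code from the thread: the class and member names (MyWrapperAU, mGraph, mAU1, mAU2, mOutputUnit) are assumptions, and error handling is minimal.]

```cpp
// Sketch only - assumes the C++ AU SDK (AUEffectBase / CAStreamBasicDescription)
// and a prebuilt graph MyAU1 -> MyAU2 -> Generic Output held in mGraph.
OSStatus MyWrapperAU::Initialize()
{
    OSStatus err = AUEffectBase::Initialize();
    if (err) return err;

    // Use one format everywhere: the wrapper's own output format
    // (e.g. stereo, 44.1 kHz, deinterleaved Float32).
    CAStreamBasicDescription fmt = GetOutput(0)->GetStreamFormat();

    AudioUnit units[] = { mAU1, mAU2, mOutputUnit };
    for (size_t i = 0; i < 3; ++i) {
        AudioUnitSetProperty(units[i], kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Input, 0, &fmt, sizeof(fmt));
        AudioUnitSetProperty(units[i], kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Output, 0, &fmt, sizeof(fmt));
    }

    // Initialize the internal graph when the wrapper is initialized...
    return AUGraphInitialize(mGraph);
}

void MyWrapperAU::Cleanup()
{
    // ...and mirror that in the cleanup path.
    AUGraphUninitialize(mGraph);
    AUEffectBase::Cleanup();
}
```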
> I'm guessing I then need to override ProcessBufferLists, but again, I'm not
> sure how...
>
> You have to:
> (1) call AudioUnitRender on your graph -
> (2) that will eventually go and call your input proc from MyAU1, from there
> you go and get the input in your wrapper au
> (3) then when that returns you have your audio going through your graph
> (4) it comes out at the end and you pass it back to the caller of your
> wrapper AU
> The PlaySequence code should help you - this example pulls audio through a
> graph to write a file (Developer/Examples/CoreAudio/SimpleSDK)
> Bill
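[Steps (1)-(4) above could be sketched roughly like this, again assuming the C++ AU SDK. The "call AudioUnitRender on your graph" step is made on the graph's tail, the Generic Output unit; the names mOutputUnit, mCurrentInput, mRenderTime, CopyInput, and the callback wiring are all hypothetical, and the render callback is assumed to have been installed on MyAU1's input via kAudioUnitProperty_SetRenderCallback.]

```cpp
// Sketch only: the wrapper pulls from the end of the graph, and the
// graph's head (MyAU1) pulls from the wrapper's input via a callback.
OSStatus MyWrapperAU::ProcessBufferLists(AudioUnitRenderActionFlags &ioFlags,
                                         const AudioBufferList &inBuffer,
                                         AudioBufferList &outBuffer,
                                         UInt32 inFrames)
{
    mCurrentInput = &inBuffer;       // stash for the callback below

    AudioTimeStamp ts = {};
    ts.mSampleTime = mRenderTime;
    ts.mFlags = kAudioTimeStampSampleTimeValid;

    // (1) Pull from the graph's tail - the Generic Output unit.
    // (2)-(3) happen inside this call: the graph pulls MyAU2, which pulls
    // MyAU1, whose input callback (below) supplies the wrapper's input.
    OSStatus err = AudioUnitRender(mOutputUnit, &ioFlags, &ts,
                                   0, inFrames, &outBuffer);
    mRenderTime += inFrames;
    return err;                      // (4) outBuffer goes back to the host
}

// Installed on MyAU1's input bus: hand the graph the wrapper's input.
static OSStatus InputProc(void *inRefCon,
                          AudioUnitRenderActionFlags *ioFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBus, UInt32 inFrames,
                          AudioBufferList *ioData)
{
    MyWrapperAU *self = static_cast<MyWrapperAU *>(inRefCon);
    // Copy (or point) the stashed mCurrentInput buffers into ioData;
    // CopyInput is a hypothetical helper.
    return self->CopyInput(ioData, inFrames);
}
```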
I've just been able to return to this after some time away, and have a
sinking feeling that I'm not really getting it at all.
Most basically, I realise I've been seeing the architecture as the
wrapper AU 'enclosing' the AUGraph, so the connections are from
wrapperAU input to graph input, then from graph output to wrapperAU
output. That (I'm guessing) is obviously wrong, since you can't
connect inputs to inputs, and so on. So how _do_ you wire up the wrapper AU
to the graph inside it? And when you say "call AudioUnitRender on your
graph", do you mean call it on the Generic I/O AU? (This is such a
basic confusion I'm embarrassed to admit it, but I'm getting nowhere
going through docs.)
The other big area of confusion is the input proc - I don't understand
its exact role, when it is and isn't required, or why. Staring at
TN2091 probably should have helped, but hasn't so far :)
Sorry for the basic questions; if anyone wants to point me at
appropriate FMs to R, I'll be grateful, and take the hint. (I have
read the obvious stuff, like AU programming guide and CoreAudio
overview.)
And thanks again Bill, also for your reply to the 'using an AU
directly' thread, which did clear up a few things for me!
Iain
Coreaudio-api mailing list (email@hidden)