Re: CoreAudio / AudioUnit
- Subject: Re: CoreAudio / AudioUnit
- From: Peter Rebholz <email@hidden>
- Date: Tue, 25 Mar 2008 10:23:24 -0500
Alex,
Am I right in assuming that you are using your audioBuffer to populate the ioData AudioBufferList in a render callback? If so, you should just be able to connect your AUMatrixReverb unit to the output unit (using AUGraphConnectNodeInput) and register the render callback on the AUMatrixReverb unit instead of the output unit. The CAPlayThrough example has something similar to this where the render callback property is set for a varispeed unit which is connected to the output unit.
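To make the wiring concrete, here is a minimal sketch of the graph Peter describes: an AUMatrixReverb node connected to the default output, with the render callback registered on the reverb's input instead of the output unit. This is an illustrative, untested outline (macOS-only, error checking omitted; `MyRenderProc` stands in for whatever callback currently fills the audioBuffer):

```c
#include <AudioToolbox/AudioToolbox.h>

// Your existing render callback that fills ioData from audioBuffer.
extern OSStatus MyRenderProc(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData);

static AUGraph BuildReverbGraph(void *refCon)
{
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription reverbDesc = {
        .componentType         = kAudioUnitType_Effect,
        .componentSubType      = kAudioUnitSubType_MatrixReverb,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription outputDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_DefaultOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode reverbNode, outputNode;
    AUGraphAddNode(graph, &reverbDesc, &reverbNode);
    AUGraphAddNode(graph, &outputDesc, &outputNode);
    AUGraphOpen(graph);

    // Reverb output bus 0 -> output unit input bus 0.
    AUGraphConnectNodeInput(graph, reverbNode, 0, outputNode, 0);

    // Register the callback on the reverb's input, not on the output unit,
    // so the chain becomes: audioBuffer -> AUMatrixReverb -> output.
    AURenderCallbackStruct cb = { MyRenderProc, refCon };
    AUGraphSetNodeInputCallback(graph, reverbNode, 0, &cb);

    AUGraphInitialize(graph);
    return graph;
}
```

Call `AUGraphStart(graph)` to begin rendering and `AUGraphStop(graph)` to stop; CAPlayThrough shows the same pattern with a varispeed unit in place of the reverb.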
Peter
Hi everybody.
I guess this is the best place to put my request; perhaps you can help
with a CoreAudio / AudioUnit question.
I have an application, working fine at the moment, that generates an
audio buffer and sends it to CoreAudio.
To do so I had to implement a render callback and so on.
So far, no problem.
Now I want to add an extra feature to my application.
Currently the application behaves like this:
audioBuffer -> CoreAudio Output
However, I want it to behave like this:
audioBuffer -> AudioUnit AUMatrixReverb -> CoreAudio Output.
I need more information on how to work with AUGraph.
I looked at many samples, but none gave me exactly what I expected.
I also noticed that applications using AUMatrixReverb all seem to
present the same interface to control it.
Is there a way to retrieve this "settings window" so I can use it to
control the added reverb?
Of course, being able to add it in a NSView would be nice too. ;)
Thanks for reading,
Alex ROUGE
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden