The structure of an iOS AudioUnit app
- Subject: The structure of an iOS AudioUnit app
- From: Schell Scivally <email@hidden>
- Date: Tue, 1 Feb 2011 00:08:39 -0800
Hi all, this is my first post to this list - thanks in advance for
taking the time to read this.
There's a gap in my very novice understanding of the structure of an
iOS AudioUnit hosting application. I've seen numerous example
applications that create an AUGraph, usually consisting of a RemoteIO
unit and one other unit (a Converter or Mixer). The custom DSP is done in
a render callback attached to one of the two units, more often the
RemoteIO unit. What I'm lacking is an explanation of how to write
arbitrary Audio Units to chain together. Should I write full-fledged
Audio Units (a .component) like the TremoloUnit example
project? Can I use those in an iOS app? Will I have to roll my own
'graph' that hooks into a main render callback?
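To make that concrete, here is roughly the pattern I keep seeing in those
examples (a minimal sketch in C against the AUGraph API; the callback, the
choice of a Multichannel Mixer, and the bus numbers are just illustrative,
and error checking is omitted):

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

/* Render callback: the graph pulls frames from here, and this is where the
   custom DSP usually goes in the examples. Filling silence for brevity. */
static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

static void BuildTypicalGraph(void)
{
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription mixerDesc = {
        .componentType         = kAudioUnitType_Mixer,
        .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode ioNode, mixerNode;
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphOpen(graph);

    /* Chain the two built-in units: mixer output 0 -> RemoteIO input 0. */
    AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);

    /* Feed the mixer's input bus 0 from the render callback above. */
    AURenderCallbackStruct cb = { MyRenderCallback, NULL };
    AUGraphSetNodeInputCallback(graph, mixerNode, 0, &cb);

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}

The chaining of the two built-in units happens in AUGraphConnectNodeInput(),
and that is exactly the step I don't know how to reproduce for a unit of my
own.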
I think I'm confused because instantiating an Audio Unit is done in a
black box. I see the AudioComponentDescription being used to retrieve
a reference to an Audio Unit that lives in the graph, but I don't see
how I can get a reference to a custom Audio Unit that I've written
separately.
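To be specific, the black-box step I mean is the description-to-instance
lookup, which as far as I can tell only finds components the system already
knows about (again just a rough sketch; names are mine and error checking is
omitted):

#include <AudioToolbox/AudioToolbox.h>

/* Getting the unit that lives in the graph: the graph matches the
   description against a registered component and instantiates it itself. */
static AudioUnit GetRemoteIOFromGraph(AUGraph graph)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode node;
    AUGraphAddNode(graph, &desc, &node);   /* the graph keeps the description  */
    AUGraphOpen(graph);                    /* ...and instantiates the unit here */

    AudioUnit unit = NULL;
    AUGraphNodeInfo(graph, node, NULL, &unit);  /* fetch the opaque reference */
    return unit;
}

/* The same lookup without a graph goes through the component registry. */
static AudioUnit GetRemoteIODirectly(void)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit = NULL;
    AudioComponentInstanceNew(comp, &unit);
    return unit;
}

Either way the instance comes back as an opaque AudioUnit, and I don't see
where a unit I've written myself would enter that lookup.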
--
Schell Scivally
email@hidden
http://blog.efnx.com
http://github.com/efnx