I have created a complex and
sophisticated synthesizer in Objective-C using Cocoa and Audio Queue
Services. (For just one simple example, frequencies, amplitudes, and
timbres can fluctuate on time scales from sub-millisecond up to
seconds. There is no start/end latency.) The synth
receives MIDI from a Logic Pro external MIDI track and sends its
output to standard audio out (which is piped back into a Logic audio
track via an audio interface). Objective-C classes organize the
modules of the project, while time-critical variables and
calculations are in plain C. There is no C++ (which I do know). It
runs in Xcode.
I would like to turn this synth into an AU
plug-in for Logic Pro, and I am looking for MINIMAL help or examples
to guide me. I think I need only a few things. POINT ME TO THE
APPROPRIATE FUNCTIONS (see below) and I will figure out the rest.
I'm just asking for a few words. I will use audio graphs if
necessary.
1. What is the C/Objective-C function (or
functions) needed to capture the signal from the Logic software track
into which the plug-in is inserted? (These signals would
replace/expand the MIDI input.)
2. What is the C/Objective-C function (or
functions) needed to send the output directly to the instrument track?
3. What is the C/Objective-C function (or
functions) required for automation?
4. How do I create the app bundle, and
where do I put it?
5. I have been informed that Audio Queue cannot be used in an AU plug-in. Is this true? If so, is there an alternative that operates in essentially the same manner?
Currently, the synth is for my own
personal use, for composing operas and songs with custom instrumental
sounds. I am not fooling around: see soundcloud.com/steven-brawer for
a sci-fi opera composed entirely with Logic 9.
I use Xcode 5.1.1, Logic Pro 10.0.7, and
OS X 10.8.5. I plan to upgrade to Yosemite, Logic Pro 10.1, and the
most recent Xcode once the forums indicate that things have settled
down sufficiently.
Thank you in advance.