I am already aware of most of this material. Unfortunately, none of this material seems to answer my questions.
Here's how I see it. The whole AudioUnit rendering process seems to be equivalent to using callbacks in AudioQueue (at least logically, and even programmatically in many ways); it's just a different set of functions. So that part will, I believe, slip right into my existing code (doubtless after lots of aggravation on my part). I can also appreciate that Xcode might be a pain. These are all details I believe I can handle.
However, as far as I can tell (and please correct me if I am wrong) none of these examples illustrate what I actually need and what I listed in my original email:
* How to get "signals" from Logic into the plugin: not external MIDI signals (which I can already handle), but signals from the Logic software track into which the plugin is inserted. These would probably be called something like Logic "internal MIDI events" or similar. There must be some function (or functions) to receive them.
* How to get the output generated by the plugin's rendering directed back to the same instrument track (so that, for instance, it can be processed by the EXISTING effects processors supplied with Logic, which I do NOT want to write).
* How to connect my plugin to automation.
Presumably all of this involves figuring out how to fit into the AUGraph structure in Logic Pro.
I already have a classy stand-alone synth: fancy GUI, nice DSP processing routines, and so on. What I need is really simple and basic: the bare-bones function calls that let me do what is listed above. I don't want C++ classes (I'm not using C++), and I don't want convenience routines; all of those are just wrappers around C (Objective-C) functions. I want the relevant functions themselves.
Again, thanks for your response, and (hopefully) for additional responses.
Perhaps, as a final comment, it will be necessary for me to pay an Apple engineer to help me with this. It is a sign of my desperation that I am actually willing to do this if I could be sure that, after spending the money, I would have my answer (and if Apple even offers this service). I'd even enroll in a course (if such a thing exists).
Anyway, thanks again.
Steve
On Apr 13, 2015, at 12:23 PM, Christian Rober <email@hidden> wrote:
I would start with building, installing, running and setting debugger breakpoints in the example AudioUnits in this sample project (look for the different Render, Parameter and MIDIEvent/StartNote calls):
While doing that I would refer to this for clarification of both the higher level design and some notable details:
You will probably also want the Core Audio helper code files. They are partially included in the sample projects above, but it may be best to "install" them in a more accessible/global location that suits your workflow. Among other things, they provide helpful C++ base classes for bootstrapping your own Audio Unit via the AU API:
One word of warning: getting the Xcode project/workspace configured for Audio Units can be tricky and has a few gotchas, especially with respect to older hosts. For example, there are many cases where the older Component Manager-based code paths require strict matching of properties and strings across multiple files.
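To make that concrete, here is a rough sketch of the kind of matching I mean. The struct is a stand-in for AudioComponentDescription and the 'Demo'/'Acme' codes are made up; the point is only that the four-character type/subtype/manufacturer codes your code registers must agree, character for character, with what the old .r resource file (and the bundle's property list) declare:

```c
#include <stdint.h>

/* Pack a four-character code the way CoreAudio does ('aumu', etc.). */
#define FOURCC(a,b,c,d) \
    ((uint32_t)(((uint32_t)(a) << 24) | ((uint32_t)(b) << 16) | \
                ((uint32_t)(c) << 8)  |  (uint32_t)(d)))

/* Stand-in for AudioComponentDescription (same first three fields). */
typedef struct {
    uint32_t componentType;
    uint32_t componentSubType;
    uint32_t componentManufacturer;
} MyComponentDescription;

/* These three codes must match, character for character, what the .r
 * resource file declares for older Component Manager hosts, e.g.:
 *
 *   #define COMP_TYPE     'aumu'   // music device (instrument)
 *   #define COMP_SUBTYPE  'Demo'   // hypothetical subtype
 *   #define COMP_MANUF    'Acme'   // hypothetical manufacturer
 */
static const MyComponentDescription kMyDesc = {
    FOURCC('a','u','m','u'),   /* music device (instrument) */
    FOURCC('D','e','m','o'),   /* hypothetical subtype      */
    FOURCC('A','c','m','e'),   /* hypothetical manufacturer */
};
```

If any one of these drifts out of sync between files, the unit can silently fail to show up in the host, which is why I suggest pattern-matching a known-good sample project.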
I create mine from scratch and pattern-match the sample code environments; others copy and alter the existing ones. Do whichever works best for you.
Good luck.
--Christian