Re: Sequencer as AU


  • Subject: Re: Sequencer as AU
  • From: Peter Johnson <email@hidden>
  • Date: Tue, 06 Jan 2009 22:30:16 +1100

Matt,

I have done exactly what you describe, making what was a standalone sequencing application an AU.

My sequencer, a form of step sequencer, is designed so that the sequencer engine is entirely C++ while the view is Objective-C. In the Cocoa AppKit app the view is an NSView; in the AU it is a Cocoa UI bundle built with the view factory and AUCocoaUIBase.
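The split described above — C++ engine, Objective-C view — can be sketched by hiding the engine behind a plain C interface that the Cocoa view calls through an opaque pointer. All names here are illustrative, not from the original post:

```cpp
#include <bitset>

// Hypothetical engine: a 16-step on/off pattern. The real engine is far
// richer, but the bridging pattern is the same.
class SequencerEngine {
public:
    void toggleStep(int i)   { mSteps.flip(static_cast<size_t>(i)); }
    bool stepOn(int i) const { return mSteps.test(static_cast<size_t>(i)); }
private:
    std::bitset<16> mSteps;
};

// C interface the Objective-C view can call without seeing any C++.
extern "C" {
    void* engine_create()                { return new SequencerEngine(); }
    void  engine_destroy(void* e)        { delete static_cast<SequencerEngine*>(e); }
    void  engine_toggle(void* e, int i)  { static_cast<SequencerEngine*>(e)->toggleStep(i); }
    int   engine_step_on(void* e, int i) { return static_cast<SequencerEngine*>(e)->stepOn(i) ? 1 : 0; }
}
```

The opaque-pointer bridge is what lets the identical engine sit behind either an NSView or an AU Cocoa UI bundle.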

I chose to make my sequencer AU an instrument that does no processing in the audio render method (actually, I think I just zero out the audio buffers to prevent any potential glitching). The render call is also where I gather timing information from the host with functions such as CallHostBeatAndTempo; this is then passed into the sequencer engine. The AU part is in C++. I don't think you can make a pure Objective-C AU — it may be possible — but I inherit from AUMonotimbralInstrumentBase, which is all C++.
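The buffer-zeroing step can be sketched in plain C++. The struct names here are simplified stand-ins for the real CoreAudio AudioBufferList (which uses a variable-length buffer array), so treat this as the shape of the idea rather than framework code:

```cpp
#include <cstring>
#include <cstdint>

// Hypothetical simplified stand-ins for the CoreAudio buffer types.
struct MockAudioBuffer {
    uint32_t mNumberChannels;
    uint32_t mDataByteSize;
    void*    mData;
};

struct MockAudioBufferList {
    uint32_t        mNumberBuffers;
    MockAudioBuffer mBuffers[1]; // variable-length in the real API
};

// Zero every output buffer so the "instrument" renders silence instead of
// whatever stale samples the host handed us -- this avoids glitches.
void ZeroOutputBuffers(MockAudioBufferList& ioData) {
    for (uint32_t i = 0; i < ioData.mNumberBuffers; ++i)
        std::memset(ioData.mBuffers[i].mData, 0, ioData.mBuffers[i].mDataByteSize);
}
```

In the real render override this runs alongside the CallHostBeatAndTempo query, before handing timing to the engine.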

The problem comes when you actually want to sequence something. The logical thing (no pun intended) in Logic would be to place your sequencer as an effect plug-in in the channel strip before the instrument. However, there is currently no way of passing MIDI down the channel strip chain the way you can process audio. Instruments like Ultrabeat keep the sequencer entirely self-contained; the MIDI can't exit.

Therefore, I went the sequencer-as-instrument route and have the AU create virtual MIDI ports, exactly as you would in a standalone application (in fact this code is identical between the app and the AU). When Logic instantiates the AU, the virtual ports are created; Logic acknowledges them with a dialogue box and adds them to the Environment. You can then wire up the Environment so the sequencer AU passes MIDI to an instrument elsewhere, either internally or to external gear. This is not a perfect solution, but it works and the timing is good.

In comparison, VST3 lets you process MIDI, and the AU API has something similar, but DAW support is not yet there. I'm hoping that will change so that we can write AUs that process audio and MIDI together.
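The render-time flow this implies — take the host's beat position, work out whether a new step has been crossed, and emit an event if so — can be sketched in plain C++. Everything here is illustrative: in the real AU the beat comes from CallHostBeatAndTempo and the event would leave through a CoreMIDI virtual source (created with MIDISourceCreate):

```cpp
#include <cmath>

// Hypothetical step sequencer core: maps the host's beat position onto a
// repeating 16-step pattern at 4 steps per beat (16th notes).
class StepSequencer {
public:
    explicit StepSequencer(int steps = 16, double stepsPerBeat = 4.0)
        : mSteps(steps), mStepsPerBeat(stepsPerBeat), mLastStep(-1) {}

    // Called once per render cycle with the host beat position.
    // Returns the step index to trigger, or -1 if no new step was crossed.
    int advanceTo(double hostBeat) {
        int step = static_cast<int>(std::floor(hostBeat * mStepsPerBeat)) % mSteps;
        if (step == mLastStep) return -1; // still inside the same step
        mLastStep = step;
        return step;
    }

private:
    int    mSteps;
    double mStepsPerBeat;
    int    mLastStep;
};
```

Because the engine is driven purely by the host's beat clock, the same code works whether the resulting events go to Logic's Environment or to external gear.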

There was one other catch, possibly a result of my architecture, but I think also a consequence of the general MVC approach of separating the Cocoa AU view from the AU itself: sending UI events back into the sequencer engine. I solved this with an AU property which gets passed between what is effectively the C++ AU and the Objective-C UI.
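That property-based hand-off can be sketched as a plain-old-data struct copied through a SetProperty-style call. The property ID, struct, and class names here are all hypothetical; by Apple's convention, third-party custom AU property IDs start at 64000:

```cpp
#include <cstring>
#include <cstdint>

enum { kMySequencerProperty_UIEvent = 64000 }; // assumed custom property ID

// Plain-old-data so it can safely cross the Objective-C / C++ boundary
// as an untyped (void*, size) blob, the way AU property data travels.
struct UIEvent {
    uint32_t stepIndex;
    uint32_t velocity;
};

// Hypothetical engine-side receiver, mirroring what a SetProperty
// override would do with the incoming data blob.
class SequencerCore {
public:
    void setProperty(uint32_t propID, const void* inData, uint32_t inSize) {
        if (propID == kMySequencerProperty_UIEvent && inSize == sizeof(UIEvent))
            std::memcpy(&mLastEvent, inData, sizeof(UIEvent));
    }
    UIEvent lastEvent() const { return mLastEvent; }

private:
    UIEvent mLastEvent{};
};
```

The UI side would make the symmetric AudioUnitSetProperty call with the same ID and struct, which is what keeps view and engine decoupled.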

The other issue to watch out for is how to save/load your sequencer data.

Hope that contained something useful. My sequencer AU is available on the web, if you want to try it or need more info, let me know.

cheers
peter

Coreaudio-api mailing list      (email@hidden)

