Re: Audio Unit input pitch based MIDI output


  • Subject: Re: Audio Unit input pitch based MIDI output
  • From: William Stewart <email@hidden>
  • Date: Mon, 22 Jan 2007 12:19:39 -0800

This approach does have its merits - however, it doesn't allow for a really robust way to synchronise a MIDI event to an audio stream that is being processed by an AU.

So, we've added a MIDI callback facility to the AU spec - this will need work on both the AU and the host app side, but ultimately it provides the best long term solution. Essentially this will work through the host registering a callback with an AU that can provide MIDI output, and then while the AU is in its render call, it can call this callback with any MIDI data it has generated. The MIDI data is time stamped in relation to the buffer being processed by the AU. The AU can also register multiple MIDI outputs as appropriate.
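The registration side of the mechanism described above can be sketched roughly as follows. The property and struct names used here are from the API as it later shipped in Leopard (kAudioUnitProperty_MIDIOutputCallback); treat the sketch as illustrative, not as the spec discussed in this message:

```c
// Sketch of the host side, assuming `au` is an opened AudioUnit.
// Requires macOS; link against the AudioUnit and CoreMIDI frameworks.
#include <AudioUnit/AudioUnit.h>
#include <CoreMIDI/CoreMIDI.h>

// Called by the AU from inside its render cycle. `timeStamp` relates the
// packets to the buffer being rendered; `midiOutNum` identifies which of
// the AU's registered MIDI outputs produced the data.
static OSStatus MyMIDIOutput(void *userData, const AudioTimeStamp *timeStamp,
                             UInt32 midiOutNum,
                             const struct MIDIPacketList *pktlist)
{
    // Forward the packets to CoreMIDI, a sequencer track, etc.
    return noErr;
}

static OSStatus RegisterMIDIOutput(AudioUnit au)
{
    AUMIDIOutputCallbackStruct cb = { MyMIDIOutput, NULL };
    return AudioUnitSetProperty(au, kAudioUnitProperty_MIDIOutputCallback,
                                kAudioUnitScope_Global, 0, &cb, sizeof(cb));
}
```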

The API will be published in Leopard - but I'm happy to discuss details privately. If you are interested, please contact me directly.

Thanks

Bill


1 - in the AU constructor: create a MIDI "device" or "output" so that other applications (and Audio Units like DLSMusicDevice) can see my AU as a MIDI source

Yes, create a virtual MIDI source (MIDISourceCreate).
For that you have to instantiate a MIDI client first (MIDIClientCreate).
Your AU will then act as a MIDI source, which other MIDI clients can connect to and receive events from.
You should also consider the case where multiple instances of your plugin are running together. In MIDISourceCreate you give a name to the virtual source, and this name should remain the same across launches so that other clients can reconnect to it later. At the same time, if every source you create is named the same, how will it be possible to identify the right plugin instance? This is not critical, since you'll just end up with multiple copies of the same input name, but it could confuse the user, I think.
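A minimal sketch of that setup, assuming macOS with the CoreMIDI framework; the names "MyPitchToMIDI" and the instance-numbering scheme are hypothetical, just one way to keep multiple copies distinguishable:

```c
// Create a virtual MIDI source in the AU constructor.
// Link with: -framework CoreMIDI -framework CoreFoundation
#include <CoreMIDI/CoreMIDI.h>

static MIDIClientRef   gClient = 0;
static MIDIEndpointRef gSource = 0;

OSStatus CreateVirtualSource(int instanceNumber)
{
    // Instantiate the MIDI client first.
    OSStatus err = MIDIClientCreate(CFSTR("MyPitchToMIDI"), NULL, NULL, &gClient);
    if (err != noErr) return err;

    // Keep the base name stable so other clients can reconnect later,
    // but append an instance number so multiple copies can be told apart.
    CFStringRef name = CFStringCreateWithFormat(NULL, NULL,
                           CFSTR("MyPitchToMIDI Out %d"), instanceNumber);
    err = MIDISourceCreate(gClient, name, &gSource);
    CFRelease(name);
    return err;
}
```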



2 - while processing: send MIDI events at the moments a note should be played

Here you have to build a packet list and call MIDIReceived() to tell CoreMIDI what data you want to send from the virtual source. All the functions for building packet lists are available in CoreMIDI; this is very easy to do.
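For example, sending a note-on from the virtual source could look like this (a sketch assuming macOS/CoreMIDI; a timestamp of 0 means "deliver now", otherwise pass a host-time MIDITimeStamp):

```c
// Build a packet list and hand it to CoreMIDI via MIDIReceived().
// Link with: -framework CoreMIDI
#include <CoreMIDI/CoreMIDI.h>

void SendNoteOn(MIDIEndpointRef source, Byte note, Byte velocity)
{
    Byte noteOn[3] = { 0x90, note, velocity };   // note-on, channel 1

    Byte buffer[128];                            // room for one small packet
    MIDIPacketList *pktList = (MIDIPacketList *)buffer;
    MIDIPacket *pkt = MIDIPacketListInit(pktList);
    pkt = MIDIPacketListAdd(pktList, sizeof(buffer), pkt,
                            0 /* timestamp: now */, sizeof(noteOn), noteOn);
    if (pkt != NULL)
        MIDIReceived(source, pktList);           // send from the virtual source
}
```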



3 - in the AU destructor: clean up the MIDI "device" or "output"

Just dispose of the objects you instantiated, that is the MIDI client and the virtual source (CoreMIDI objects are disposed with MIDIEndpointDispose and MIDIClientDispose rather than CFRelease).
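The teardown can be sketched like this, assuming the hypothetical gSource and gClient globals hold the MIDIEndpointRef and MIDIClientRef created in the constructor:

```c
// Tear down in the AU destructor, in reverse order of creation.
#include <CoreMIDI/CoreMIDI.h>

static MIDIClientRef   gClient;   // created with MIDIClientCreate
static MIDIEndpointRef gSource;   // created with MIDISourceCreate

void DisposeVirtualSource(void)
{
    if (gSource) { MIDIEndpointDispose(gSource); gSource = 0; }
    if (gClient) { MIDIClientDispose(gClient);   gClient = 0; }
}
```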


On 27/12/2006, at 10:30 AM, Frank Schoep wrote:

Hello everyone,

As this is my first post to the mailing list, allow me to introduce myself briefly. If you want to skip to the question, see the "---" mark below.

As you've probably guessed, my name is Frank Schoep and I've been creating some pitch-shifting Audio Units lately which are available with full source code on my website:
http://www.ffnn.nl/pages/projects.php


I got interested in Audio Unit programming when I purchased my first Apple computer, a Macbook, at the end of May this year. As I'm the only guitarist in my band I'm always in for some great live effects to juice up my sound and so far I've been really impressed by my Macbook's possibilities.

I started out by creating a reverse delay effect, then a simple pitch shifter and my latest creation is a diatonic pitch shifting effect. You might know Stephan M. Bernsee, the creator of the DSP Dimension website, he helped me get up to speed with Fourier transforms and the basics of Audio Unit programming.

I'm really happy with the CoreAudio architecture and the tools and APIs provided by Apple. AU Lab is a beautiful application that has seen many hours of live usage on my Macbook already and I'm always looking to push it further. That's why I'm writing to the list now.
---
My diatonic pitch shifting effect uses a simple note detection algorithm based on the forward Fourier transform and works perfectly for monophonic input signals. Based on this detection algorithm I want to create an Audio Unit that generates MIDI notes to drive a synthesizer.


I've read through the CoreAudio Overview document and looked at the Xcode examples for "PlaySequence", "PlaySoftMIDI" and "SampleTools" and I've got quite a good view on how to use the CoreAudio API for MIDI purposes. What I lack, however, is a solid background in MIDI and the terminology.

I'd really appreciate it if you could explain to me how to implement the MIDI part of an Audio Unit that generates MIDI notes based on audio input. I'm thinking about something like this:

1 - in the AU constructor: create a MIDI "device" or "output" so that other applications (and Audio Units like DLSMusicDevice) can see my AU as a MIDI source
2 - while processing: send MIDI events at the moments a note should be played
3 - in the AU destructor: clean up the MIDI "device" or "output"


Which API calls and structures should I use for each of these steps? I'm not sure if I want to create a "virtual" MIDI source or a "physical" one and what the consequences would be. I'm not looking to mix audio and MIDI in a synchronized fashion. I simply want to generate MIDI events and other applications (DLSMusicDevice) should use my Audio Unit as the MIDI source.

If parts of my question are unclear or if you think I'm approaching the problem the wrong way, I'd be glad to discuss it. I know that most of you will have more experience with this than I have, so your feedback is greatly appreciated.

Thanks a lot in advance for your responses, and best wishes for the new year.

With kind regards,

Frank Schoep
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden

--
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________




  • Follow-Ups:
    • Re: Audio Unit input pitch based MIDI output
      • From: "William Cotton" <email@hidden>