Audio Unit input pitch based MIDI output
- Subject: Audio Unit input pitch based MIDI output
- From: Frank Schoep <email@hidden>
- Date: Wed, 27 Dec 2006 19:30:10 +0100
Hello everyone,
As this is my first post to the mailing list, allow me to introduce
myself briefly. If you want to skip to the question, see the "---"
mark below.
As you've probably guessed, my name is Frank Schoep and I've been
creating some pitch-shifting Audio Units lately which are available
with full source code on my website:
http://www.ffnn.nl/pages/projects.php
I got interested in Audio Unit programming when I purchased my first
Apple computer, a MacBook, at the end of May this year. As I'm the
only guitarist in my band, I'm always on the lookout for great live
effects to juice up my sound, and so far I've been really impressed
by my MacBook's possibilities.
I started out by creating a reverse delay effect, then a simple pitch
shifter, and my latest creation is a diatonic pitch-shifting effect.
You might know Stephan M. Bernsee, the creator of the DSP Dimension
website; he helped me get up to speed with Fourier transforms and the
basics of Audio Unit programming.
I'm really happy with the CoreAudio architecture and the tools and
APIs provided by Apple. AU Lab is a beautiful application that has
already seen many hours of live usage on my MacBook, and I'm always
looking to push it further. That's why I'm writing to the list now.
---
My diatonic pitch-shifting effect uses a simple note detection
algorithm based on the forward Fourier transform, and it works very
well for monophonic input signals. Based on this detection algorithm,
I want to create an Audio Unit that generates MIDI notes to drive a
synthesizer.
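To give you an idea of the last step in that detection: once the
forward FFT gives me a dominant bin and thus a fundamental frequency,
mapping it to a MIDI note number is the simple part. A stripped-down
sketch (the helper name is made up, and my real code does the peak
picking first):

#include <cmath>

// Map a detected fundamental frequency in Hz to the nearest MIDI
// note number, using the convention A4 = 440 Hz = MIDI note 69.
static int FrequencyToMIDINote(double frequency)
{
    double note = 69.0 + 12.0 * std::log(frequency / 440.0) / std::log(2.0);
    return (int) std::floor(note + 0.5); // round to the nearest note
}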
I've read through the CoreAudio Overview document and looked at the
Xcode examples "PlaySequence", "PlaySoftMIDI" and "SampleTools", so
I have a fairly good idea of how to use the CoreAudio API for MIDI
purposes. What I lack, however, is a solid background in MIDI and its
terminology.
I'd really appreciate it if you could explain to me how to implement
the MIDI part of an Audio Unit that generates MIDI notes based on
audio input. I'm thinking about something like this:
1 - in the AU constructor: create a MIDI "device" or "output" so that
other applications (and Audio Units like DLSMusicDevice) can see my
AU as a MIDI source
2 - while processing: send MIDI events at the moments a note should
be played
3 - in the AU destructor: clean up the MIDI "device" or "output"
Which API calls and structures should I use for each of these steps?
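From reading the CoreMIDI headers, my best guess at the three steps
looks like the sketch below. The class and member names are just
placeholders, and I'm not at all sure these are the right calls, so
please correct me where I'm wrong:

#include <CoreMIDI/CoreMIDI.h>

class PitchToMIDIOutput {
public:
    PitchToMIDIOutput() : mClient(0), mSource(0)
    {
        // Step 1: register a client with CoreMIDI and publish a
        // virtual source endpoint that other applications can pick
        // as a MIDI input.
        MIDIClientCreate(CFSTR("Pitch Detector"), NULL, NULL, &mClient);
        MIDISourceCreate(mClient, CFSTR("Pitch Detector Out"), &mSource);
    }

    ~PitchToMIDIOutput()
    {
        // Step 3: dispose of the endpoint and the client again.
        if (mSource) MIDIEndpointDispose(mSource);
        if (mClient) MIDIClientDispose(mClient);
    }

    // Step 2: called whenever the detector decides a note starts.
    // 'note' is a MIDI note number (0-127), 'velocity' is 1-127.
    void SendNoteOn(Byte note, Byte velocity)
    {
        Byte buffer[1024]; // backing storage for the packet list
        MIDIPacketList *packetList = (MIDIPacketList *) buffer;
        MIDIPacket *packet = MIDIPacketListInit(packetList);

        const Byte noteOn[3] = { 0x90, note, velocity }; // note-on, channel 1
        packet = MIDIPacketListAdd(packetList, sizeof(buffer), packet,
                                   0 /* timestamp: now */, 3, noteOn);
        if (packet != NULL)
            MIDIReceived(mSource, packetList); // hand the event to CoreMIDI
    }

private:
    MIDIClientRef   mClient;
    MIDIEndpointRef mSource;
};

Does MIDIReceived do what I think it does here, i.e. distribute the
packet list to whoever connected to my virtual source?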
I'm not sure whether I want to create a "virtual" MIDI source or a
"physical" one, or what the consequences of each choice would be. I'm
not looking to mix audio and MIDI in a synchronized fashion; I simply
want to generate MIDI events, and other applications (and Audio Units
like DLSMusicDevice) should be able to use my Audio Unit as the MIDI
source.
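(For what it's worth, MIDISourceCreate in the sketch above looks like
the "virtual" variant to me; from the headers I get the impression
that "physical" endpoints are only published by MIDI drivers, so I
assume virtual is the way to go for an Audio Unit, but I'd love
confirmation. I also assume note-offs would be sent exactly like the
note-on above, just with an 0x80 status byte instead of 0x90.)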
If parts of my question are unclear, or if you think I'm approaching
the problem the wrong way, I'd be glad to discuss it. I know most of
you have more experience with this than I do, so your feedback is
greatly appreciated.
Thanks a lot in advance for your responses, and best wishes for the
new year.
With kind regards,
Frank Schoep