Scoping a MIDI project
- Subject: Scoping a MIDI project
- From: Chris Adamson <email@hidden>
- Date: Mon, 05 Sep 2011 15:45:52 -0400
I'm sizing up the MIDI-related APIs, and I'm a little surprised to see what looks like a gap between the APIs for processing MIDI events from hardware (Core MIDI) and those for sending MIDI commands to instrument units in an AUGraph (MusicDevice.h).
Just so I don't miss an easy win somewhere, could someone please answer a few factual questions:
1. Is it the case that to play from a hardware MIDI device to an instrument AudioUnit, I will need to handle the CoreMIDI events in a read proc, and then manually send my own MusicDeviceMIDIEvent() calls to the instrument unit(s)?
1.5. Can I assume this is what, say, GarageBand does when an external MIDI device is attached and played? In other words, is this what everyone does, or is there some really easy MIDI-device-to-instrument-unit win that I'm overlooking?
2. Is there any convenience API for converting a MIDIPacket (which consists of a timestamp, length, and byte[]) into the inStatus, inData1, and inData2 args that MusicDeviceMIDIEvent() needs? Or is this considered so trivial that it doesn't warrant a convenience API? (I've sketched what I'm imagining for 1 and 2 below.)
(I haven't looked at the format of the MIDI packet… I imagine that's next)
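In case it helps to see what I'm picturing for 1 and 2, here's a rough, untested sketch of the read proc I'd write. The names (MyMIDIReadProc, synthUnit) are just placeholders for whatever instrument unit lives in my graph, and I'm waving away the fact that a packet can hold multiple messages (or partial sysex), and I'm ignoring the packet timestamps entirely:

#include <CoreMIDI/CoreMIDI.h>
#include <AudioToolbox/AudioToolbox.h>

// CoreMIDI read proc that forwards each packet's first channel message
// to an instrument unit. readProcRefCon is assumed to be the AudioUnit.
static void MyMIDIReadProc(const MIDIPacketList *pktList,
                           void *readProcRefCon,
                           void *srcConnRefCon)
{
    AudioUnit synthUnit = (AudioUnit)readProcRefCon;
    const MIDIPacket *packet = &pktList->packet[0];
    for (UInt32 i = 0; i < pktList->numPackets; i++) {
        // Naive parse: assumes one complete channel message per packet.
        UInt32 status = packet->data[0];
        UInt32 data1  = (packet->length > 1) ? packet->data[1] : 0;
        UInt32 data2  = (packet->length > 2) ? packet->data[2] : 0;
        // Ignoring packet->timeStamp and playing immediately
        // (offsetSampleFrame = 0).
        MusicDeviceMIDIEvent(synthUnit, status, data1, data2, 0);
        packet = MIDIPacketNext(packet);
    }
}

Wiring it up would presumably just be MIDIInputPortCreate(client, CFSTR("Input"), MyMIDIReadProc, synthUnit, &inPort) followed by MIDIPortConnectSource(inPort, source, NULL) for each source endpoint. Again, untested.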
The next two questions are largely rhetorical:
3. MusicDevice.h looks like it's been in the Mac OS X SDK since at least 10.3. Shouldn't it be in the documentation bundle by now?
4. A lot of the MIDI-related sample code doesn't build as-is on Lion. PlaySoftMIDI needs to be converted to use the AudioComponent API (roughly the substitution sketched below), and PlaySequence has 30+ build errors that seem to come from C++ imports in the PublicUtility classes. Should I be filing these at bugreport.apple.com?
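For what it's worth, the PlaySoftMIDI conversion I have in mind is just swapping the deprecated Component Manager lookup for its AudioComponent equivalent. Something like this untested sketch, where OpenDLSSynth is my own name for the helper:

#include <AudioToolbox/AudioToolbox.h>

// Replace FindNextComponent/OpenAComponent with the AudioComponent API.
static OSStatus OpenDLSSynth(AudioUnit *outSynth)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_MusicDevice,
        .componentSubType      = kAudioUnitSubType_DLSSynth,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
        .componentFlags        = 0,
        .componentFlagsMask    = 0
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    if (comp == NULL) return -1; // no matching component found
    return AudioComponentInstanceNew(comp, outSynth);
}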
Thanks in advance!
--Chris