Skinny Tod,

Glad to be of help.

> From reading the CoreAudio docs it seems like it would be possible to
> specify multiple endpoints - for my needs: a DLSMusicDevice (to play the
> sound) and (maybe) a virtual port - to get the actual event data in order
> to display it. I may have this wrong though - tinkering with it now.
I've never tried to work with MIDI Endpoints, so I'm interested to hear how that works out. If it isn't entirely satisfactory, consider:
If you need the DLS Synth, my first thought is to use an AUGraph. PlaySequence should provide guidance in setting up the default AUGraph with the DLS synth. The nice thing is that the connections between the MusicSequence and the AUGraph are done for you. (Unless you want to play through a plug-in synth, in which case you would need to make your own nodes and connections.)
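For what it's worth, here's a rough sketch of the sort of graph I mean, patterned after PlaySequence as I remember it. The function name is mine and all the error checking is omitted (also, on older SDKs AUGraphAddNode takes a ComponentDescription rather than an AudioComponentDescription):

#include <AudioToolbox/AudioToolbox.h>

/* Rough sketch, error checking omitted: build an AUGraph with the DLS
   synth feeding the default output, then hand it to the MusicSequence so
   the sequence's tracks drive the synth node. */
static void SetUpDLSGraph(MusicSequence sequence, AUGraph *outGraph)
{
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription synthDesc = { 0 };
    synthDesc.componentType         = kAudioUnitType_MusicDevice;
    synthDesc.componentSubType      = kAudioUnitSubType_DLSSynth;
    synthDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponentDescription outputDesc = { 0 };
    outputDesc.componentType         = kAudioUnitType_Output;
    outputDesc.componentSubType      = kAudioUnitSubType_DefaultOutput;
    outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode synthNode, outputNode;
    AUGraphAddNode(graph, &synthDesc, &synthNode);
    AUGraphAddNode(graph, &outputDesc, &outputNode);

    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, synthNode, 0, outputNode, 0);
    AUGraphInitialize(graph);

    /* The sequence finds the MusicDevice node and routes its tracks there. */
    MusicSequenceSetAUGraph(sequence, graph);

    AUGraphStart(graph);
    *outGraph = graph;
}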
Since my app is all about algorithmic manipulation of MIDI note data, I always have the MIDI events at hand, in my TrackObject and NoteObject classes, and therefore just inspect the NoteObjects' timestamps for when to display their data.
So I'm just speculating, but I think it might work to use a MusicEventIterator (each one is associated with a track obtained from a MusicSequence). The MusicTrack API doesn't seem to let you get at the individual events, nor does the MusicSequence API; that seems to leave the MusicEventIterator. I've seen some postings about them on this list, so do a search and see. The most useful function would probably be this one:
MusicEventIteratorGetEventInfo
Gets information about the event at a music event iterator’s current position.

OSStatus MusicEventIteratorGetEventInfo (
    MusicEventIterator  inIterator,
    MusicTimeStamp      *outTimeStamp,
    MusicEventType      *outEventType,
    const void          **outEventData,
    UInt32              *outEventDataSize
);
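If it helps, here's roughly how I'd picture walking a track with it, using NewMusicEventIterator, MusicEventIteratorHasCurrentEvent, and MusicEventIteratorNextEvent around that call. The function name and the note-message handling are just my guess at a starting point:

#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

/* Sketch: walk every event in one track and print the note events.
   Assumes `track` came from MusicSequenceGetIndTrack(); no error checking. */
static void DumpTrackEvents(MusicTrack track)
{
    MusicEventIterator iterator;
    NewMusicEventIterator(track, &iterator);

    Boolean hasCurrent = false;
    MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent);

    while (hasCurrent) {
        MusicTimeStamp timeStamp;
        MusicEventType eventType;
        const void    *eventData;
        UInt32         eventDataSize;

        MusicEventIteratorGetEventInfo(iterator, &timeStamp, &eventType,
                                       &eventData, &eventDataSize);

        if (eventType == kMusicEventType_MIDINoteMessage) {
            const MIDINoteMessage *note = eventData;
            printf("beat %.3f  note %d  velocity %d  duration %.3f\n",
                   timeStamp, note->note, note->velocity, note->duration);
        }

        MusicEventIteratorNextEvent(iterator);
        MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent);
    }

    DisposeMusicEventIterator(iterator);
}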
Then, in your main-thread NSTimer callback, you could display the data for whichever event is up next. That's not a small project, but from my limited experience it looks like a way to get what you want.
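The way I'd picture that polling working (names are made up; assumes you keep a MusicPlayer and an iterator around between timer firings, with the iterator parked at the next undisplayed event):

#include <AudioToolbox/AudioToolbox.h>

/* Sketch of the polling idea, callable from the NSTimer callback on the
   main thread. Returns true when an event has become due and fills in
   its type, data pointer, and size for display. */
static Boolean NextDueEvent(MusicPlayer player, MusicEventIterator iterator,
                            MusicEventType *outType, const void **outData,
                            UInt32 *outSize)
{
    MusicTimeStamp now;
    MusicPlayerGetTime(player, &now);           /* playback position in beats */

    Boolean hasCurrent = false;
    MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent);
    if (!hasCurrent)
        return false;                           /* nothing left to display */

    MusicTimeStamp eventTime;
    MusicEventIteratorGetEventInfo(iterator, &eventTime, outType, outData, outSize);
    if (eventTime > now)
        return false;                           /* next event isn't due yet */

    MusicEventIteratorNextEvent(iterator);      /* advance for the next poll */
    return true;
}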
Keep us posted...
David