Re: Sending MIDI to virtual instruments
- Subject: Re: Sending MIDI to virtual instruments
- From: William Stewart <email@hidden>
- Date: Tue, 18 Mar 2008 18:44:49 -0700
On Mar 18, 2008, at 5:12 PM, Todd Blanchard wrote:
As I understand it, AUMIDIController used to take care of buffering
and timely feeding of MIDI messages to virtual instruments. This
thing is now marked deprecated in favor of MusicPlayer and
MusicSequence.
Not really.
AUMIDIController connected to the CoreMIDI endpoints and then fed the
MIDI messages to the audio units you wired into it. It was pretty
straightforward and didn't really do much more than that. We are
actually in the process of writing a simple example app that does the
basics of MIDI input parsing, etc...
MusicPlayer is an object that plays a sequence of music events - it is
conceptually very different.
I've been playing with building an arpeggiator and want to fiddle
the timestamps on some MIDI events to deliver them later than they
occur (and perhaps repetitively as well). Currently, since my app
is all about live performance and has zero recording or playback
features, I just get the MIDI events from the system, buffer them
briefly, and feed them all to the AudioUnit using
MusicDeviceMIDIEvent in the device's rendering notification
callback. Seems to work fine for 'direct control'.
Yep - that's what you do
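A minimal, self-contained sketch of that buffering scheme (the names here, including deliver(), are hypothetical stand-ins; in the real app the delivery step would be the MusicDeviceMIDIEvent() call made from the render-notification callback, with a 0 sample offset meaning "play now"):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Incoming MIDI events are queued as they arrive from CoreMIDI, then
   drained inside the render-notification callback. */

typedef struct {
    uint8_t status, data1, data2;   /* raw MIDI bytes */
} MIDIEvt;

#define QUEUE_CAP 64

typedef struct {
    MIDIEvt evts[QUEUE_CAP];
    size_t  head, tail;             /* single-producer/single-consumer ring */
} MIDIQueue;

/* Called from the MIDI input side. Returns 0 if the queue was full. */
int queue_push(MIDIQueue *q, MIDIEvt e) {
    size_t next = (q->tail + 1) % QUEUE_CAP;
    if (next == q->head) return 0;  /* full: drop rather than block */
    q->evts[q->tail] = e;
    q->tail = next;
    return 1;
}

/* Called from the render-notification callback: drain everything now.
   deliver is a placeholder for MusicDeviceMIDIEvent(unit, e.status,
   e.data1, e.data2, 0). A NULL deliver just counts the events. */
size_t queue_flush(MIDIQueue *q, void (*deliver)(MIDIEvt)) {
    size_t n = 0;
    while (q->head != q->tail) {
        if (deliver) deliver(q->evts[q->head]);
        q->head = (q->head + 1) % QUEUE_CAP;
        n++;
    }
    return n;
}
```

The single-producer/single-consumer ring keeps the render-side flush free of locks, which matters because the render notification runs on the real-time audio thread.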
Apparently, this call completely ignores the timestamp on the event
and simply executes the action associated with the event NOW. At
least that is what I'm observing from trying to create a 'flam'
effect by sending duplicate note events one octave higher and with
timestamps set (I think) half a second later.
The call explicitly states "sample offset", and this is the sample
offset within the next buffer of audio that will be rendered. Audio
units do not schedule events past the current/next buffer they render
(neither did AUMIDIController).
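A quick sketch of the arithmetic behind why the half-second flam silently plays "now" (the 44.1 kHz sample rate and 512-frame buffer here are typical assumed values, not anything mandated by the API):

```c
#include <assert.h>
#include <stdint.h>

/* The offset passed to MusicDeviceMIDIEvent() is in sample frames
   *within the next render buffer*; an offset beyond that buffer can't
   be honored, so the event effectively fires immediately. */

/* Desired delay in seconds -> sample frames at the given sample rate. */
uint32_t delay_to_frames(double seconds, double sampleRate) {
    return (uint32_t)(seconds * sampleRate + 0.5);
}

/* An offset is only meaningful if it lands inside the next buffer. */
int offset_fits_in_buffer(uint32_t offsetFrames, uint32_t bufferFrames) {
    return offsetFrames < bufferFrames;
}
```

Half a second at 44.1 kHz is 22,050 frames, vastly larger than a typical 512-frame render buffer, which is why the duplicate notes land on top of the originals.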
Is there a facility for scheduling events later (when I say 'later'
I mean half a beat later or so) or do I just have to do it myself?
I keep seeing references to using MusicPlayer and MusicSequence -
but I can't see how this relates to a live situation. All the
writing seems to relate to playing back a file of events. I'm
getting them in real time.
You can use the sequence player for this. With the player playing, take
your MIDI event and use the host-time-to-beats call to translate its
time stamp into a beats value - that value is basically "now". Then add
a number of beats to it; if you want your event played, say, 5 seconds
later, it will be.
(the music player is playing, and the sequence track you are going to
add events to is targeting your audio unit)
So:
- take the MIDI packet's host time stamp from Core MIDI
- music player host time to beats -> "now" beat time
- now beat time + 5 (let's assume your tempo is 60 bpm, so 1 beat ==
1 second)
- add the music events you want with the new beat time
- when the player gets to that time, it will "play" the event for you
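The beat arithmetic in those steps can be sketched like this (a minimal illustration of the conversion only - in the real code the "now" beat would come from MusicPlayerGetBeatsForHostTime, and the event would be added with something like MusicTrackNewMIDINoteEvent):

```c
#include <assert.h>

/* Convert a delay in seconds into beats at a given tempo:
   beats = seconds * (tempo / 60). At 60 bpm, 1 beat == 1 second,
   so a 5-second delay is simply now + 5 beats. */
double delay_seconds_to_beats(double seconds, double tempoBPM) {
    return seconds * (tempoBPM / 60.0);
}

/* The beat time stamp to give the new event. */
double scheduled_beat(double nowBeats, double delaySeconds, double tempoBPM) {
    return nowBeats + delay_seconds_to_beats(delaySeconds, tempoBPM);
}
```

Note that at other tempos the conversion matters: the same 5-second delay at 120 bpm is 10 beats, not 5.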
Bill
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)