At the very least, you should design your program so that you schedule MIDI events slightly ahead of time. The reason is that some MIDI interfaces can accept time-stamped MIDI data and handle the timing directly at the hardware level, delivering the messages precisely on time. This feature depends upon a correctly implemented MIDI driver with an accurate connection to the CoreAudioClock, but the folks who make such MIDI interfaces (the multi-port interfaces with custom USB drivers) are undoubtedly doing this.
I'm not sure how far in advance you should schedule things, but the practical limit is that if the user presses stop, any already-queued MIDI events will still fire. As long as the lookahead is short enough that this transport delay is not objectionable, you should feed MIDI events significantly ahead of the playback position.
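A minimal sketch of that lookahead idea, in plain C. The event type, millisecond units, and function name are all assumptions for illustration, not CoreMIDI API; in a real app you would hand each due event to CoreMIDI with its timestamp so the driver or hardware fires it on time.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical event record: timestamp in milliseconds of song time. */
typedef struct {
    uint64_t time_ms;
    uint8_t  status, data1, data2;
} MidiEvent;

/*
 * Return how many events (starting at *cursor) fall inside the lookahead
 * window [now_ms, now_ms + lookahead_ms), advancing the cursor past them.
 * Events are assumed sorted by time.
 */
size_t events_due(const MidiEvent *events, size_t count, size_t *cursor,
                  uint64_t now_ms, uint64_t lookahead_ms)
{
    size_t first = *cursor;
    while (*cursor < count &&
           events[*cursor].time_ms < now_ms + lookahead_ms)
        (*cursor)++;
    return *cursor - first;
}
```

On stop, you would simply stop advancing the cursor, accepting that events already handed off within the window may still sound.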
For virtual MIDI instruments, this helps because it allows MIDI events to be scheduled on a sample-accurate basis within the rendering buffer.
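To illustrate what "sample-accurate within the buffer" means, here is a hedged sketch of converting an event's host-clock timestamp into a frame offset inside the current render buffer. The function name, tick units, and clamping policy are assumptions, not any CoreAudio API.

```c
#include <stdint.h>

/*
 * Convert an event timestamp (in host-clock ticks) into a frame offset
 * within a render buffer that starts at buffer_start_ticks.  A late event
 * lands on frame 0; an event past the end is clamped (a real renderer
 * would defer it to a later buffer instead).
 */
uint32_t frame_offset_in_buffer(uint64_t event_ticks,
                                uint64_t buffer_start_ticks,
                                double ticks_per_second,
                                double sample_rate,
                                uint32_t frames_in_buffer)
{
    if (event_ticks <= buffer_start_ticks)
        return 0;  /* event is due now, or already late */
    double seconds =
        (double)(event_ticks - buffer_start_ticks) / ticks_per_second;
    uint64_t frames = (uint64_t)(seconds * sample_rate + 0.5);
    if (frames >= frames_in_buffer)
        return frames_in_buffer - 1;  /* belongs in a later buffer */
    return (uint32_t)frames;
}
```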
The key is that audio is scheduled according to the CoreAudioClock, and CoreMIDI is scheduled according to the same clock. Therefore, you should run your application clock on the same reference.
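One way to picture running everything off one reference: derive every timestamp (audio, MIDI, and UI cues) from a single musical position mapped onto the shared clock. This sketch assumes a fixed tempo and invented names; a real sequencer would track tempo changes.

```c
#include <stdint.h>

/*
 * Map a beat position to a host-clock tick count, given one anchor point
 * (anchor_beat occurred at anchor_ticks) and a fixed tempo.  All names
 * are illustrative.
 */
uint64_t ticks_for_beat(double beat, double anchor_beat,
                        uint64_t anchor_ticks,
                        double tempo_bpm, double ticks_per_second)
{
    double seconds_from_anchor = (beat - anchor_beat) * 60.0 / tempo_bpm;
    return anchor_ticks +
           (uint64_t)(seconds_from_anchor * ticks_per_second + 0.5);
}
```

Because MIDI scheduling, audio rendering, and the on-screen playback cursor all call the same conversion, they can never drift apart.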
Brian Willoughby
Sound Consulting
On Dec 8, 2008, at 14:41, Carlos Eduardo Mello wrote:
On Dec 8, 2008, at 8:20PM, William Stewart wrote:
The core audio clock gives you a much finer resolution. But given your description below, I'm really not at all sure what you are doing, and it seems to me that you have already decided how you should basically implement whatever it is you are doing.
Basically, I am writing a sequencer application. Nothing too fancy, just a basic multi-track sequencer.
The interesting stuff is in the user interface and in a couple of specific features for data editing.
And no, I haven't decided on anything related to playback or input. I am just slowly experimenting with the CoreMIDI API to see what I should use. So your help will be most invaluable.
- Let me see if I understand your suggestion: I should use the CoreAudio clocking facility in order to schedule my MIDI messages?
- Are there any samples on that specific type of code?
- Is it necessary to use threads other than the main one? (The reason I ask is that I will need to keep track of playback time in order to display visual cues for the user...)
Coreaudio-api mailing list