Re: Questions on data in MIDIPackets
- Subject: Re: Questions on data in MIDIPackets
- From: Herbie Robinson <email@hidden>
- Date: Mon, 18 Feb 2002 22:01:54 -0500
[ >If you consider the exact case of a Human Interface Device (keyboard,
[ >expression controller, etc.) connected via MIDI 1.0, it seems like it
[ >would be a good thing if the CoreMIDI driver subtracted 320 microseconds
[ >from the time stamp of the receive interrupt and reported that adjusted
[ >time as the actual absolute time of the real event.
[
[ That's what I had in mind, too...
[
[ >This seems like a way to improve the accuracy of time stamps. The problem
[ >is how to determine whether the input is coming from a live event or a
[ >sequencer.
[ >In the case of a sequencer, there's no way to know if the sequencer has
[ >already adjusted its timing to account for MIDI 1.0 delays.
[
[ If the timing spec is defined as the beginning of the packet, only
[ the receiver should be adjusting; so, if we are talking OS X to
[ OS X, this should work. If some other system had decided to sync
[ to the tail end of the packet, then there would be an inaccuracy.
I had moved beyond the question of time stamping on the head or tail of a
packet; rather, I was trying to see if there is a way to get an even more
accurate timing indication of the real event, which occurs before the head
of the packet.
I should point out that my example situation is a human playing a MIDI
controller of some kind in sync with audio or MIDI that is being played at
the same time the new MIDI input is recorded. In this case, it seems most
important to get the time stamp of the actual event, because even the head
of the MIDI 1.0 packet is delayed by 320 microseconds (a MIDI byte is 10
bits on the wire, one start bit, 8 data bits, and one stop bit, and 10 bits
at 31250 baud take 320 microseconds).
My assumption is that if the time stamp is for the head of the packet, it
corresponds to the time the first byte started transmitting, not the time
it finished transmitting. You are right, though, it could mean either. I
guess we just came up with a reason for poor Doug to edit the
documentation again...
I was thinking that the driver would be adjusting the time stamp for
whatever the transmission delay was. Note that any MIDI driver receiving a
MIDI event cannot react to it until it has received the entire event
(because the buffers must contain complete events). It could capture the
time stamp after receiving the first byte, and that would be the most
accurate approach if the sender couldn't run at full line speed.
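To make that concrete, here is a minimal sketch of the kind of driver-side
adjustment being discussed. The helper name is mine, not anything in
CoreMIDI, and it assumes the time stamp was captured at the tail of the
message:

    #include <CoreAudio/HostTime.h>
    #include <CoreMIDI/CoreMIDI.h>

    /* Hypothetical helper, not part of CoreMIDI: back-date a receive
     * time stamp by the MIDI 1.0 wire time of the message.  One byte
     * is 10 bits on the wire (start + 8 data + stop); at 31250 baud
     * that is 10 / 31250 s = 320 microseconds per byte. */
    static MIDITimeStamp AdjustForWireDelay(MIDITimeStamp receivedAtTail,
                                            UInt32 numBytes)
    {
        const UInt64 kNanosPerByte = 320000;  /* 320 us in nanoseconds */
        UInt64 delay = AudioConvertNanosToHostTime(kNanosPerByte * numBytes);
        return (receivedAtTail > delay) ? receivedAtTail - delay : 0;
    }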
[ >In some respects it
[ >may not be needed if the driver can "adjust" the time stamps on incoming
[ >data before the app sees the events, but it might still help to have this
[ >information around in the cases where it cannot be determined without human
[ >help as to whether the incoming data is live or playback of a recorded
[ >performance.
[
[ I was assuming that the driver would adjust the time stamps on
[ input. What you have brought up is that it would be useful to
[ have that controlled by the user based on what was connected...
The sequencer example was just to show that we cannot always assume that
MIDI input is coming live from some HID. A stand-alone sequencer (not Mac
OS X) might shift its timing around by any amount to correct for
transmission delays, while a MIDI controller cannot predict human input
and must always suffer a transmission delay.
We are saying the same thing; I just didn't elaborate enough to be clear
that I agreed with you...
[ I was actually thinking the driver would add a constant to
[ the MIDI device and the application would read it. What I thought
[ it would be used for was having the application deal with the case
[ where it wanted to schedule many events at the same time: Dumping
[ that chore on the driver isn't a good idea. The application is
[ the only thing that knows what can be safely slid around (for
[ example, the drums probably want to be the most accurate and the
[ driver has no idea which channels are drums).
You're talking about output now, so the solution could easily be different
than for input.
Yeah. Input is relatively easy. The really hard part (i.e.,
ordering events in the slow pipe) has been decided before the OS X
MIDI driver comes into the picture.
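As an aside, the driver-published constant from the quoted suggestion
could map onto CoreMIDI device properties. A minimal sketch, assuming a
hypothetical property name (CoreMIDI defines no such property today):

    #include <CoreMIDI/CoreMIDI.h>

    /* Hypothetical property name; not defined by CoreMIDI. */
    #define kMyTransmissionDelayProperty CFSTR("transmissionDelayMuSec")

    /* Read the per-device transmission delay the driver published,
     * falling back to zero if the driver didn't set one. */
    static SInt32 GetTransmissionDelayMuSec(MIDIDeviceRef device)
    {
        SInt32 delay = 0;
        if (MIDIObjectGetIntegerProperty(device, kMyTransmissionDelayProperty,
                                         &delay) != noErr)
            delay = 0;
        return delay;
    }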
The issue you raise is what the whole system (application and driver)
should do about "too many" events with the same time stamp, which cannot
possibly all be transmitted simultaneously. Your suggestion is a step in
the right direction, because the driver cannot possibly know all on its
own which MIDI events have priority (e.g. drums). However, I don't like
the idea of the application shifting the time stamps around, even if based
on information from the driver, simply to communicate that certain notes
should be favored for perfect timing if not all notes can happen when
scheduled. The proper solution for the issue you raise is for the
application to communicate which MIDI events have priority for perfect
timing, and which notes can be shifted depending on the hardware
limitations. Messing around with time stamps to communicate note priority
could get very fuzzy, because the driver would not be able to distinguish
events that were shifted to dodge a predicted traffic jam from events that
were not shifted at all.
Note that the MIDI driver may be merging output from several applications,
so having one application shift time stamps to keep the drum events on
perfect ticks would be thwarted when another application scheduled events
at the same time.
Perhaps the CoreMIDI API could be expanded to allow an application to specify
the priority of a MIDI event with respect to how accurate its timing
should be.
At first I thought of suggesting that either the first queued or the last
queued events should have priority (since this could simply be
documented, but
would not change the existing API), but the case with multiple MIDI
applications sending output to be merged by the driver breaks this kludge.
It seems that it might be helpful to have either a flag (allow shift vs.
don't allow shift) or a valued parameter (e.g. a higher priority event
should not be shifted as long as there are lower priority events with the
same time stamp).
Perhaps there could even be a hint as to whether it is better for
the driver to
shift an individual event ahead or behind in time when there are traffic jams.
I suppose it is up to the major MIDI sequencing software developers to express
how important this sort of support in the API would be.
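To illustrate what such an extension might look like (purely hypothetical;
nothing like this exists in the CoreMIDI API), the hint could be as small
as an enum attached to each scheduled event:

    /* Purely hypothetical per-event timing-priority hint; the names
     * are illustrative, not part of CoreMIDI. */
    typedef enum {
        kMIDIEventNoShift      = 0,  /* must land on its exact tick (e.g. drums) */
        kMIDIEventShiftEarlier = 1,  /* prefer moving ahead in time if congested */
        kMIDIEventShiftLater   = 2   /* prefer moving behind in time if congested */
    } MIDIEventShiftHint;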
The output is much more difficult, because one can't be perfect, and if
one tries to be, one is dumping a graph coloring algorithm into each MIDI
driver. Graph coloring algorithms are heuristics (which means they won't
all get the same answer) and complex (as in slow and often buggy). I
think it's more practical to get enough information into the database so
that applications can implement the graph coloring algorithm if they want
to, but to leave it to the application. I don't think that streams
getting merged from multiple applications is very common, but even if it
is, each application will have spread the packets out in the desired
order via time stamps; so, there is some info for prioritizing the merge.
Definitely a judgement call here...
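For what it's worth, a merge that trusts each application's own time
stamps is cheap to implement. A minimal sketch, assuming simplified event
records (the struct is mine, not CoreMIDI's):

    #include <stddef.h>

    /* Simplified event record for illustration only. */
    typedef struct {
        unsigned long long timeStamp;  /* host-clock time stamp */
        unsigned char      status;     /* MIDI status byte */
    } Event;

    /* Stable merge of two per-application streams, each already sorted
     * by time stamp.  Ties go to stream a, and each stream's internal
     * ordering is preserved, since that ordering is the only priority
     * information the driver has. */
    static void MergeStreams(const Event *a, size_t na,
                             const Event *b, size_t nb,
                             Event *out)
    {
        size_t i = 0, j = 0, k = 0;
        while (i < na && j < nb)
            out[k++] = (a[i].timeStamp <= b[j].timeStamp) ? a[i++] : b[j++];
        while (i < na) out[k++] = a[i++];
        while (j < nb) out[k++] = b[j++];
    }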
--
******************************************
**  http://www.curbside-recording.com/  **
******************************************
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.