Re: Questions on data in MIDIPackets
- Subject: Re: Questions on data in MIDIPackets
- From: Bill Stewart <email@hidden>
- Date: Sun, 24 Feb 2002 14:20:32 -0800
on 23/2/02 1:27 AM, Kurt Bigler wrote:
> on 2/21/02 4:57 PM, Brian Willoughby <email@hidden> wrote:
>> The only thing I wonder is whether percussion timings might benefit
>> from some sort of priority flagging which might allow a group of
>> events to be shifted in time (but not reordered) during heavy traffic
>> jams such that important events happen "on the beat" as closely as
>> possible. But this does seem to be more in the realm of an application
>> instead of the driver...
>
> I'm not sure what you meant by "percussion timings", but if you meant to
> exclude keyboard note-on/off events from this category, please don't.
> Keyboards need the same timing accuracy as other percussion instruments.
> (As Keith Jarrett says, the piano is just 88 drums.)
Can't resist commenting on this.
Pianos have, I believe, about a 50 msec latency from the first touch on
the key to when the sound is heard by the player. I've actually heard
comments from pianists using digital emulations of pianos that they find
these instruments TOO reactive, and would prefer some introduced latency
in the response so it feels more like a real piano.
Another thing to consider: sound travels about a foot a msec. So consider
an orchestra, which is what, about a 20 ft+ radius - so the conductor has
to deal with at least a 20 msec latency (which is why every player in an
orchestra adapts, and why they always sound so "fat").
Then another one... A drummer hits the skin of a drum - his/her ear is
2-3 ft away from the skin (there's a 2-3 msec "latency" for the sound to
travel from the skin to his/her ear) - and it probably takes 10 msec to
get to the rest of the band. The horrible latency that drummers always
complain about with MIDI probably has more to do with the reaction time of
the instrument getting to actually SEND the MIDI message than with the
msec of transmission time of the MIDI message.
From all of this, the main "musical" problem seems to me not so much one
of latency, but one of consistency and predictability - i.e. jitter - at
least in performance.
In X we don't guess what the transmission time is; we time stamp the
message as early as we can on input, and send it as close as we can on
output. We also use the same time mechanism for both the audio and MIDI
time stamps (UpTime or HostTime) - so I think the mechanisms are there to
find out what is going on. The HAL will also report what the driver says
are any of its known latencies, so you can tell (if the driver is correct)
when a particular sample will cross over from A-D or D-A.
(I know some of this isn't relevant and you can all start yelling at me
now!)
Bill
> on 2/21/02 4:42 PM, Brian Willoughby <email@hidden> wrote:
>> longer bit streams such as USB will probably always receive
>> entire packets with a single time stamp.
>
> Lacking knowledge of details here, it sounds like multiple MIDI messages
> get packaged with a single time stamp. This sounds unfortunate, unless
> this clumping could be done intelligently (e.g. only if the message time
> difference is less than the desired threshold of accuracy). Certainly a
> time stamp per MIDI message from source to destination would be the
> ultimate. No doubt future interfaces will go in that direction unless it
> turns out not to matter.
>
> In the meantime, designing to allow for timing corrections sounds good,
> but it may be hard to anticipate all the consequences, and also in what
> ways future MIDI interfaces will be different.
>
> So I hope the Apple design is flexible enough to allow this to evolve,
> e.g. by providing for options throughout the MIDI "stack" which have
> best-guess defaults, but can be overridden at any level based on
> information available to that level, e.g. by the application because the
> user has told the application something about the connected devices or
> how they will be used that the drivers couldn't have guessed.
>
> The ultimate control belongs with the user. If an application overrides
> a driver default without allowing the user to override that, it is not a
> good thing.
>
> And if multi-application scenarios are important, as suggested by:
>
>> To get to my point, if a couple of applications are running off the
>> same timing sync source, then it actually becomes quite likely that
>> multiple clients will schedule events for the same time stamp (e.g. on
>> the beat).
>
> then system-level configuration may be particularly important, lest the
> user have to set compatible options separately in several applications
> via different interfaces (yuck).
>
> To be very specific: If the user can choose whether and by what heuristic
> various drivers may attempt timing corrections (etc.), that might avoid
> unforeseen trouble that might result from too much driver "intelligence".
> If the user can intervene, the driver can be less conservative and risk
> more "intelligent" defaults.
>
> -Kurt Bigler
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Thousands of years ago, cats were worshipped as gods. We have never
forgotten this."
__________________________________________________________________________
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.