Re: Questions on data in MIDIPackets
- Subject: Re: Questions on data in MIDIPackets
- From: Andre Lipinski <email@hidden>
- Date: Sun, 24 Feb 2002 17:56:06 -0500
On Sunday, February 24, 2002, at 05:20 PM, Bill Stewart wrote:
on 23/2/02 1:27 AM, Kurt Bigler wrote:
on 2/21/02 4:57 PM, Brian Willoughby <email@hidden> wrote:
The only thing I wonder is whether percussion timings might benefit from some sort of priority flagging which might allow a group of events to be shifted in time (but not reordered) during heavy traffic jams, such that important events happen "on the beat" as closely as possible. But this does seem to be more in the realm of an application instead of the driver...
I'm not sure what you meant by "percussion timings", but if you meant to
exclude keyboard note-on/off events from this category, please don't.
Keyboards need the same timing accuracy as other percussion instruments.
(As Keith Jarrett says, the piano is just 88 drums.)
Can't resist commenting on this.
Pianos have, I believe, about a 50 msec latency from the first touch on the key to when the sound is heard by the player. I've actually heard some comments from pianists using digital emulations of pianos that they find these instruments TOO reactive and would prefer some introduced latency in the response so it feels more like a real piano.
Another thing to consider: sound travels about a foot a msec. So consider an orchestra, which is what, about a 20ft+ radius - so the conductor has to deal with at least a 20msec latency (which is why every player in an orchestra adapts, and why they always sound so "fat").
Then another one... A drummer hits the skin of a drum - his/her ear is 2-3ft away from the skin (there's a 2-3msec "latency" for the sound to travel from the skin to his/her ear) - and it's probably 10msec to get to the rest of the band. The horrible latency that drummers always complain about with MIDI is probably more to do with the reaction time of the instrument getting to actually SEND the MIDI message than with the msec of transmission time of the MIDI message.
From all of this, the main "musical" problem seems to me to be not so much one of latency, but one of consistency and predictability, i.e. jitter, at least in performance.
In X we don't guess what the transmission time is; we time stamp the message as early as we can on input, and send it as close as we can on output. We also use the same time mechanism for both the audio and MIDI time stamps (UpTime or HostTime) - so I think the mechanisms are there to find out what is going on. The HAL will also report what the driver says are any of its known latencies, so you can tell (if the driver is correct) when a particular sample will cross over from A-D or D-A.
(I know some of this isn't relevant and you can all start yelling at me
now!)
Bill
Finally! I've had enough of this thread.
Andre.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.