Re: Questions on data in MIDIPackets

  • Subject: Re: Questions on data in MIDIPackets
  • From: Herbie Robinson <email@hidden>
  • Date: Sun, 24 Feb 2002 17:57:12 -0500

Another thing to consider: sound travels about a foot per millisecond. So consider an orchestra, which spans what, a 20 ft+ radius? That means the conductor has to deal with at least 20 msec of latency (which is why every player in an orchestra adapts, and why they always sound so "fat").

Then another one... A drummer hits the skin of a drum with an ear 2-3 ft away from the skin, so there's a 2-3 msec "latency" for the sound to travel from the skin to the ear, and probably 10 msec for it to reach the rest of the band. The horrible latency that drummers always complain about with MIDI probably has more to do with the instrument's reaction time in actually SENDING the MIDI message than with the millisecond or so it takes to transmit the MIDI message itself.
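A quick back-of-envelope sketch of the acoustic latencies above (in Python, assuming a speed of sound of roughly 1125 ft/s; the figures in the text use the coarser one-foot-per-millisecond rule of thumb):

```python
# Acoustic latency from distance. Speed of sound in air is ~1125 ft/s
# at room temperature, i.e. ~0.89 ms per foot, which rounds to the
# handy "one foot per millisecond" rule of thumb.
SPEED_OF_SOUND_FT_PER_S = 1125.0  # approximate, at ~20 deg C

def acoustic_latency_ms(distance_ft: float) -> float:
    """Milliseconds for sound to travel distance_ft feet."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_S * 1000.0

for label, feet in [("drum skin to drummer's ear", 2.5),
                    ("drummer to the rest of the band", 10.0),
                    ("far side of a 20 ft orchestra", 20.0)]:
    print(f"{label}: {acoustic_latency_ms(feet):.1f} ms")
```

So the conductor's 20 ft works out to just under 18 ms with the exact figure, i.e. the rule of thumb slightly overestimates.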

From all of this, the main "musical" problem seems to me to be not so much latency as consistency and predictability, i.e. jitter, at least in performance.

In Mac OS X we don't guess what the transmission time is; we time stamp the message as early as we can on input, and send it as close to its stamped time as we can on output. We also use the same time mechanism for both the audio and MIDI time stamps (UpTime or HostTime), so I think the mechanisms are there to find out what is going on. The HAL will also report any latencies the driver says it knows about, so you can tell (if the driver is correct) when a particular sample will cross over from A-D or D-A.
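To illustrate the stamp-early/send-late idea in a few lines -- this is not the Core MIDI API, just a sketch using Python's monotonic clock as a stand-in for HostTime:

```python
import time

def on_midi_input(data: bytes) -> tuple[int, bytes]:
    """Stamp an incoming event as early as possible, in nanoseconds
    on a monotonic clock shared with the rest of the system."""
    return (time.monotonic_ns(), data)

def schedule_output(stamped_ns: int, data: bytes, send, latency_ns: int = 0):
    """Wait until just before the stamped time (plus any known device
    latency), then hand the bytes to send() -- the closer to the
    deadline the send happens, the less jitter is introduced."""
    delay_s = (stamped_ns + latency_ns - time.monotonic_ns()) / 1e9
    if delay_s > 0:
        time.sleep(delay_s)
    send(data)
```

Because input and output share one timebase, any remaining discrepancy you measure is real latency, not clock skew.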

(I know some of this isn't relevant and you can all start yelling at me
now!)

Bill

It's not off topic. The effect of latency and jitter in that latency is very definitely part of the requirements spec for what we are doing.

I should point out that what you are describing above are all natural phenomena. The catch is that MIDI is used in some decidedly unnatural ways. One that immediately comes to mind is layering multiple synths with the same part. If the synths are producing similar, but not quite the same waveforms (multiple kick drum samples, for example), then minute changes in the firing times will affect the timbre of the resulting sound. In this case, sub-msec jitter accuracy is needed to achieve repeatable results.
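As a rough illustration of why sub-msec offsets matter when layering: two identical sine layers summed with a small time offset comb-filter each other, and the peak amplitude at a given frequency works out to 2·|cos(π·f·τ)|. The numbers below are hypothetical, just to show the scale of the effect:

```python
import math

def layered_gain(freq_hz: float, offset_s: float) -> float:
    """Peak amplitude of two identical unit sine layers summed with a
    time offset tau: sin(wt) + sin(w(t - tau)) has peak 2*|cos(w*tau/2)|,
    i.e. 2*|cos(pi * f * tau)|."""
    return 2.0 * abs(math.cos(math.pi * freq_hz * offset_s))

print(layered_gain(1000.0, 0.0))      # in phase: 2.0 (a 6 dB boost)
print(layered_gain(1000.0, 0.0005))   # 0.5 ms late: ~0, full cancellation
print(layered_gain(1000.0, 0.0002))   # 0.2 ms late: partial comb notch
```

So if the jitter wanders between those offsets from pass to pass, the composite timbre audibly changes, which is the repeatability problem described above.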

Also, I don't remember the reference, but supposedly experiments have shown that some musicians can distinguish relative event times on the order of 1/2 msec.

If two sounds are heavily panned, very short relative time discrepancies show up as motion or wavering in the stereo image. I don't remember the times involved.
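For a sense of scale: the largest naturally occurring interaural time difference -- the cue the ear uses to localize a sound left-to-right -- is itself well under a millisecond, which suggests why such tiny discrepancies can shift the image (rough numbers, assuming ~343 m/s and ~21-22 cm effective ear spacing):

```python
# Maximum interaural time difference (ITD): roughly the effective
# ear-to-ear path length divided by the speed of sound. Offsets of
# this size or smaller between hard-panned sounds fall inside the
# range the ear uses for localization.
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate, in air
EAR_SPACING_M = 0.215           # approximate effective head width

max_itd_ms = EAR_SPACING_M / SPEED_OF_SOUND_M_PER_S * 1000.0
print(f"max natural ITD ~ {max_itd_ms:.2f} ms")
```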
--
-*****************************************
** http://www.curbside-recording.com/ **
******************************************
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.
References: 
 >Re: Questions on data in MIDIPackets (From: Bill Stewart <email@hidden>)
