Re: Has nobody used CoreAudio Clock?
- Subject: Re: Has nobody used CoreAudio Clock?
- From: Brian Willoughby <email@hidden>
- Date: Tue, 03 May 2011 15:07:45 -0700
Hi Tom,
No, I'm not talking about MTC (MIDI Time Code). MTC is completely
independent of CoreMIDI's time stamping and MOTU's MTS. If you want
MTC, then you will have to generate it in your MIDI data stream just
like any other MIDI message. Whether you are using MIDI Clock or
MIDI Time Code, your performance will be improved by the timing
accuracy of CoreMIDI, provided there is an MTS or similar hardware
interface in use (otherwise, CoreMIDI does the best that it can with
a generic MIDI device).
Software synths are a different matter, because they add the timing
difficulties of on-board audio and audio-interface latency on top of
the MIDI timing issues. Software synths will always have a degree of
latency that is determined by the audio interface, and you'd probably
want to adjust for this in your MIDI sequencer playback engine for
the tracks that are targeted to internal software synths.
In contrast, you would not want to adjust the timing of external MIDI
synths because you want CoreMIDI to deliver the data on time. The
only exception here would be if you allow your user to manually enter
a latency value for an external synth - basically a time shift of a
track under user control.
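Roughly, such a track-level time shift could look like this (an
untested sketch; CompensateForSynthLatency is just a placeholder name,
the latency value is assumed to come from the user or the synth, and
the stamps are assumed to be CoreAudio host time):

#include <CoreAudio/HostTime.h>
#include <CoreMIDI/CoreMIDI.h>

/* Shift an event's host-time stamp earlier to compensate for a
   software synth's audio latency, so the note sounds at the intended
   moment rather than late by the interface's buffer size. */
static MIDITimeStamp
CompensateForSynthLatency(MIDITimeStamp eventTime, double latencySeconds)
{
    UInt64 latencyTicks =
        AudioConvertNanosToHostTime((UInt64)(latencySeconds * 1.0e9));
    return (eventTime > latencyTicks) ? (eventTime - latencyTicks) : 0;
}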
By the way, 10 ms is the pre-loading time. The accuracy of CoreMIDI
and MTS is a fraction of 1 ms, better even than USB is capable of
otherwise. Classic MIDI has a timing resolution of 32 µs (0.032 ms),
one bit period at its 31.25 kbaud rate, so that would be the
theoretical lower limit of CoreMIDI's timing accuracy unless you're
working with an interface that goes beyond MIDI and USB-MIDI. The
latency of classic MIDI is higher than USB, of course, because of the
lower bandwidth, but the asynchronous nature of classic MIDI allows it
to greatly outperform the 1 ms timing jitter of USB-MIDI.
To recap:
Classic MIDI has 32 µs timing accuracy, but any significant amount of
traffic potentially creates a bottleneck where the throughput causes
delays. Thus, classic MIDI has data-dependent jitter. The advanced
timing features of CoreMIDI allow highly accurate time stamps, and
if a MIDI interface can divide the traffic among multiple output
ports such that there aren't any bottlenecks, then that 32 µs timing
accuracy is available, because the final output of any MIDI interface
is classic MIDI.
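Some back-of-the-envelope arithmetic shows where that data-dependent
jitter comes from: at 31.25 kbaud with 10 bits per byte on the wire,
every byte occupies the cable for 320 µs, so a burst of messages
smears out in time. A quick illustration of the numbers (mine, just
restating the rates above):

#include <stdio.h>

int main(void)
{
    double bit_us  = 1.0e6 / 31250.0;  /* 32 us per bit                  */
    double byte_us = 10.0 * bit_us;    /* 320 us per byte (start + stop) */
    double note_us = 3.0 * byte_us;    /* ~960 us per 3-byte Note On     */

    printf("bit %.0f us, byte %.0f us, Note On %.0f us\n",
           bit_us, byte_us, note_us);
    printf("a 10-note chord spreads over roughly %.1f ms\n",
           10.0 * note_us / 1000.0);
    return 0;
}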
USB-MIDI has approximately 1 ms timing accuracy, which amounts to a
lot of jitter. There isn't even a strict guarantee that the jitter
stays within that 1 ms, because system load might cause a USB-MIDI
message delivery to be delayed by any number of 1 ms frames. The
amount of data, for all practical purposes, has no effect on this
jitter, so you're left with jitter that is either totally random or
system-load dependent.
Advanced USB devices which have a non-USB-MIDI implementation
(possibly as an optional addition to the USB-MIDI standard) gain the
advantages of both classic MIDI and USB, at the cost of requiring a
custom CoreMIDI driver. The hardware device and CoreMIDI keep
themselves synchronized via the driver to maintain a highly accurate
clock. There are no bottlenecks since Full Speed USB is much faster
than classic MIDI. Even the random delays in USB can be circumvented
by delivering the data several frames in advance, enough to
guarantee that system load will never delay the data beyond its "due"
date. The hardware then maintains a small buffer of MIDI data that
is held in waiting until the precise moment that it should be
transmitted. Since the classic MIDI output is capable of 32 µs
timing, and the hardware device itself is independent of your main
CPU, this allows very precise timing - since the MIDI hardware isn't
really doing anything but real-time MIDI.
P.S. Edirol also makes (made?) a USB MIDI interface with a switch to
choose between the USB-MIDI standard and a custom protocol with
highly accurate timing that would make CoreMIDI happy. You can't
change the switch on the fly - you have to set it before you plug in
the USB connector - but otherwise it's a great option.
On May 3, 2011, at 14:43, Tom Jeffries wrote:
Brian, I assume you are talking about using MTC? Or does the
MIDIPacket architecture have that level of accuracy? We also want
to work with software synths. How is the clock accuracy when the
synth is running on the computer rather than being external?
10 ms would be excellent.
On Tue, May 3, 2011 at 2:32 PM, Brian Willoughby
<email@hidden> wrote:
On May 3, 2011, at 13:38, Tom Jeffries wrote:
It's very simple- I have a MIDI sequencer that is playing data
that is updated in real time. I want to use a clock to check to
see what has been added, and then send the MIDI data out. I can
probably get reasonable accuracy using MIDIPackets, but it seems
more direct to take control of the playback by use of an accurate
clock.
Maybe that's not an option on OS X? I've done this on other
operating systems, I'll be a little surprised if OS X doesn't
allow it.
Your research has overlooked a very powerful feature of CoreMIDI:
The ability to queue outgoing events in advance of when they
should be transmitted, such that the system and/or hardware can
handle the timing more precisely than anything you could hope to
do as a new OSX programmer.
Basically, by insisting on handling the timing yourself, the
accuracy will only be as good as you can manage in your code.
There are options for hardware timing that would not be available
to you unless you use the CoreMIDI time stamping.
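For reference, time-stamped output looks roughly like this (an
untested sketch; it assumes the MIDI client, output port, and
destination endpoint have already been created elsewhere):

#include <CoreAudio/HostTime.h>
#include <CoreMIDI/CoreMIDI.h>

/* Queue a Note On 10 ms in the future; CoreMIDI (and, with an
   MTS-style interface, the hardware itself) takes care of
   transmitting it at the stamped time. */
void SendNoteOnAhead(MIDIPortRef outPort, MIDIEndpointRef dest)
{
    Byte buffer[128];
    MIDIPacketList *list = (MIDIPacketList *)buffer;
    MIDIPacket *packet = MIDIPacketListInit(list);

    MIDITimeStamp when = AudioGetCurrentHostTime()
                       + AudioConvertNanosToHostTime(10ULL * 1000 * 1000);

    const Byte noteOn[3] = { 0x90, 60, 100 };  /* ch 1 Note On, middle C */
    packet = MIDIPacketListAdd(list, sizeof(buffer), packet,
                               when, sizeof(noteOn), noteOn);
    if (packet != NULL)
        MIDISend(outPort, dest, list);
}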
What is particularly important to take note of is that some MIDI
interfaces have their own clocking. CoreMIDI is able, via the
proper CoreMIDI driver, to synchronize with the MIDI hardware
clock to deliver MIDI data in advance of when it should be
transmitted. Under such conditions, the MIDI hardware holds on to
this advance data and then uses its own clock to control the
timing of data transmission. This is much more precise than
anything you could do at the software level, especially
considering the latency of the link between the main CPU and the
actual MIDI hardware. Thus, even if you were an advanced OSX
kernel programmer, you would not have access to the same degree of
timing accuracy as the internals of the MIDI interface itself.
MIDI Interfaces such as the Emagic AMT-8 / Unitor 8 and various
MOTU interfaces with their proprietary MTS (MIDI Time Stamping)
feature support this hyper-accurate timing capability of CoreMIDI.
Unfortunately, USB-MIDI is an incredibly flawed design that was
created without the cooperation of the MIDI Manufacturers
Association. As a result, USB-MIDI has absolutely no control over
timing or latency. It's "good enough" for the price point, but
not necessarily good enough for music. This is a problem because
most people buy a generic USB-MIDI interface, probably because of
the price and because a custom driver is not needed.
You might think that your code doesn't have to be better than 99%
of the MIDI interfaces out there, but why go to great lengths to
devise your own timing mechanisms when you can use CoreMIDI? As
more people learn about MIDI interfaces with advanced hardware
timing, your application will automatically take advantage of the
improvements, but without any special-cased code getting in the
way of normal operation with inferior USB-MIDI interfaces.
Then, the challenge becomes managing your recorded or generated
MIDI output stream such that you can deliver data slightly ahead
of the playback position. I'm not even sure what Apple
recommends, but I'd say that at least 10 ms in advance would
probably be a good idea. Your MIDI sequencer should thus be
designed to always be looking slightly ahead, so that CoreMIDI
will always have a chance to download the MIDI data stream to the
MIDI interface in time for ultra-precise timing.
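A bare-bones sketch of that look-ahead loop might look like the
following (untested; NextEventBefore() and the Event struct are
hypothetical stand-ins for whatever the sequencer's own data model
provides):

#include <CoreAudio/HostTime.h>
#include <CoreMIDI/CoreMIDI.h>
#include <unistd.h>

typedef struct { MIDITimeStamp when; Byte data[3]; } Event;

/* Hypothetical accessor: returns the next unsent event stamped before
   the given horizon, or NULL when the queue is caught up. */
extern Event *NextEventBefore(MIDITimeStamp horizon);

void PlaybackLoop(MIDIPortRef outPort, MIDIEndpointRef dest)
{
    const UInt64 lookAhead = AudioConvertNanosToHostTime(10ULL * 1000 * 1000);

    for (;;) {
        MIDITimeStamp horizon = AudioGetCurrentHostTime() + lookAhead;

        Event *e;
        while ((e = NextEventBefore(horizon)) != NULL) {
            Byte buffer[64];
            MIDIPacketList *list = (MIDIPacketList *)buffer;
            MIDIPacket *packet = MIDIPacketListInit(list);
            packet = MIDIPacketListAdd(list, sizeof(buffer), packet,
                                       e->when, sizeof(e->data), e->data);
            if (packet != NULL)
                MIDISend(outPort, dest, list);
        }
        usleep(2000);   /* top up the queue every couple of milliseconds */
    }
}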
Brian Willoughby
Sound Consulting
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden