Re: Questions on data in MIDIPackets
- Subject: Re: Questions on data in MIDIPackets
- From: Kurt Bigler <email@hidden>
- Date: Mon, 25 Feb 2002 13:12:17 -0800
on 2/25/02 9:58 AM, Bill Stewart <email@hidden> wrote:
> on 24/2/02 3:54 PM, Kurt Bigler wrote:
>> The current OS X APIs still treat the midi "protocol" as "data" rather
>> than logical events (or did I miss something?), which puts us in a
>> position of being not so well prepared for note numbers > 127,
>> floating-point key velocities ...
>
> The MIDI Services on X are just that - getting MIDI data in and out of
> the system to MIDI Devices (and some IA facilities). How could we
> possibly design and implement an API for a protocol that doesn't exist
> yet and has no real hardware support?
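To make the point concrete, this is roughly what the "data" view looks
like in a CoreMIDI read proc today. It is a rough sketch from the
CoreMIDI headers, not compiled or tested, and MyReadProc is just a
hypothetical callback name:

#include <CoreMIDI/CoreMIDI.h>

/* A read proc sees the MIDI protocol as raw status/data bytes, so note
   numbers and velocities can never be finer than 7 bits. */
static void MyReadProc(const MIDIPacketList *pktlist,
                       void *readProcRefCon, void *srcConnRefCon)
{
    const MIDIPacket *pkt = &pktlist->packet[0];
    UInt32 i;
    for (i = 0; i < pktlist->numPackets; ++i) {
        /* Assumes one plain (non-running-status) message per packet,
           just to keep the sketch short. */
        if (pkt->length >= 3 && (pkt->data[0] & 0xF0) == 0x90) {
            Byte note     = pkt->data[1];   /* 0..127, nothing finer */
            Byte velocity = pkt->data[2];   /* 0..127, nothing finer */
            /* ... hand note/velocity to the synth or engine here ... */
        }
        pkt = MIDIPacketNext(pkt);
    }
}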
My thought was that by providing an alternative to MIDI Services that uses
the abstractions you describe in your next paragraph...
> We also know that MIDI is not the be all and end all of these kinds of
> protocols (particularly for Software based rendering) so we specifically
> provided APIs for the MusicDevice component (AudioUnit/MusicDevice.h)
> that allows for:
>   Floating Pt note numbers
>   Floating Pt parameter lists to new notes of arbitrary lengths
>   Arbitrary numbers of groups (ie. Think MIDI Channels) - where also
>     groups are NOT defined to be notes on a particular instrument,
>     but are user defined
>   Floating Pt parameter values
>   Sample offsets for all of these.
> (As well as MIDI APIs)
...instead of raw MIDI data, you would take a step in the right
direction. You obviously couldn't support ALL of an unknown future
protocol, but we can already anticipate SOME of the features that a
MIDI-replacement protocol would provide, and many of these are sorely
needed right now. By anticipating that functionality (e.g. more accurate
parameters) and building a protocol that incorporates it IN ANY FORM, you
would be providing a protocol that both MIDI and a future protocol could
be translated to, transparent to the application that uses the enhanced
protocol.
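Just to sketch what I mean by translation that is transparent to the
application: an incoming MIDI note-on could be handed to a MusicDevice in
the extended form right now. This is untested; StartTranslatedNote is a
hypothetical helper, and it assumes the MusicDeviceNoteParams layout
(argCount/mPitch/mVelocity) and the kMusicNoteEvent_UseGroupInstrument
constant as I read them in AudioUnit/MusicDevice.h - the exact names may
differ in your header revision:

#include <AudioUnit/MusicDevice.h>

/* Start a note on a MusicDevice using float pitch/velocity, a group in
   place of a MIDI channel, and a sample-accurate offset. */
static NoteInstanceID StartTranslatedNote(MusicDeviceComponent synth,
                                          Byte midiNote, Byte midiVelocity,
                                          UInt32 offsetFrames)
{
    MusicDeviceNoteParams params;
    NoteInstanceID noteID = 0;

    params.argCount  = 2;                     /* pitch + velocity, no extras */
    params.mPitch    = (Float32)midiNote;     /* could just as well be 60.5,
                                                 or a value above 127 */
    params.mVelocity = (Float32)midiVelocity; /* no longer a 7-bit step */

    MusicDeviceStartNote(synth,
                         kMusicNoteEvent_UseGroupInstrument, /* group's instrument */
                         0,                                  /* group ("channel")  */
                         &noteID,
                         offsetFrames,                       /* sample offset      */
                         &params);
    return noteID;
}

A MIDI-only source would simply never exercise the extra range; a better
source, or a future driver, could.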
This might mean that future devices using a new protocol would have only
a fraction of their features available to existing applications, but that
fraction would still be larger than the MIDI-only one. No doubt these
manufacturers would provide a MIDI-compatible driver for their new
enhanced hardware, allowing it to be used _immediately_, in a degraded
way, with old applications. An intermediate protocol anticipating future
functionality would mean that users would have _immediate_ access to
better-than-MIDI functionality as soon as the new hardware and the OS
support are released, without having to wait for new applications.
This immediate availability of new features would motivate the equipment
manufacturers, because they could see more sales months earlier. It would
free the manufacturer to take advantage of a platform that already
exists.
An enhanced intermediate protocol, if designed well, could be a helpful
step in defining the future MIDI replacement. Of course it would weaken
that effort if designed badly - the old de-facto-standard problem.
Maybe, for reasons I am unaware of, this is not a worthy goal. But to
look at it another way, I don't see any reason to perpetuate the current
limitations of MIDI in any application written today. Current hardware is
MIDI. Current software does not need to stay there a moment longer. Let
the OS translate MIDI into something more forward-looking right now. Then
the way is immediately open for manufacturers to take the next steps.
I am just suggesting doing the "obvious things". Maybe this is too much
work, and for political reasons too many people would need to be on a
committee to discuss it, such that you might as well wait for somebody
else to finish a new standard document first. But Apple has been bold
before, so why not take a step that yields obvious advantages and call it
a day?
>> The current APIs do not permit applications to be developed now which
>> will support future changes in the underlying technologies. Perhaps it
>> seems too hard to anticipate a future standard to make it seem worth
>> pursuing this now. Nonetheless I believe many of the next steps
>> (previous paragraph) are obvious. So why not act now and build a
>> software API that _leads_ hardware development?
>
> I might pose the counter question....
>
> When are we going to see tools (sequencers and such) for developers
> that, whilst allowing for MIDI based instruments, actually provide the
> user with far more flexibility and power? There's a lot of this kind of
> stuff around - Csound and the like - but it isn't very user friendly (to
> understate it!) and doesn't work well with existing work practises of
> musicians and engineers.
Yes, well, you can get into a whole other world here where there aren't
even notes anymore. There is not much you can do for that arena yet. I am
not talking about getting away from notes, or about continuous real-time
updating of frequencies or spectrum information.
The point is that we already have keyboards with notes, we use them, and
we find the MIDI protocol to be lacking in obvious ways that do NOT
require going beyond the realm of notes. The existing medium/metaphor
does not live up to its own implied expectations. This can be addressed
without getting into other realms.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.