Re: Calling TellListener & AUParameterSet in audio thread
- Subject: Re: Calling TellListener & AUParameterSet in audio thread
- From: Marc Poirier <email@hidden>
- Date: Fri, 21 Mar 2003 05:53:28 +0100 (CET)
> Let's step back a moment ... the purpose of TellListener is to inform
> the host of a UI gesture in a plug-in's view. Currently the only
> gestures that are defined are mouse-down and mouse-up.
>
> The purpose of this functionality is for recording automation gestures;
> looking at the pro consoles with flying faders, and software that uses
> the same paradigm, a mouse-down (touching a fader) does not necessarily
> change its value right away (not until the fader is moved), and yet the
> time at which the fader was touched may have a lot of significance,
> i.e. stop playing back the existing automation for the relevant
> parameter, and prepare to replace it with new automation if the fader
> *is* moved.
This is a kind of mixed-up mish-mash of GUI and DSP things here. I
believe that's where Raphael's concerns are coming from. I fully
understand that the purpose of recording a "gesture" is to allow for
proper "touch"-style automation (and any automation that requires the
concept of a gesture), and that's exactly why it doesn't make sense to
split that capability off exclusively into the GUI domain. It should be
possible for a plugin to begin and end automation gestures for any reason,
at any time, with or without any GUI interaction.
In addition, it should be possible for a plugin to produce a single value
change that can be recorded as automation, not necessarily a gesture.
> MIDI doesn't really have this concept, at least not with single
> controls. You don't know that a MIDI controller was touched until you
> get a new value from it.
>
> Now, theoretically we could have high-class MIDI controllers (or do
> some already exist ... ?) that send two different control-change
> messages, one on/off to indicate when the user touches/releases the
> fader, and another continuous value to reflect when the control is
> actually moved. In this case the view could indeed be receiving MIDI
> messages it wanted to forward to the host for "control-touched" and
> "control-released" (the more properly general way to describe
> "control-mouse-down" and "control-mouse-up"). And then we would indeed
> want to pay attention to the thread context of these messages.
MIDI does not have this concept built into its protocol, but it's not
unusual for MIDI applications to use a time-out method of recognizing
gestures. In other words, if consecutive MIDI events occur within a
certain duration, then they are considered to be part of a single
gesture. Once the time-out duration elapses after some event (without any
new events occurring), then the gesture ends.
Marc
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.