Re: audio-specific UI component libraries for Cocoa?
- Subject: Re: audio-specific UI component libraries for Cocoa?
- From: Brian Willoughby <email@hidden>
- Date: Mon, 1 Nov 2010 12:19:19 -0700
On Nov 1, 2010, at 09:29, Stephen Blinkhorn wrote:
On 1 Nov 2010, at 05:15, Paul Davis wrote:
On Mon, Nov 1, 2010 at 12:50 AM, Roman Thilenius <email@hidden> wrote:
that is what i would do, too; everything from a GUI goes to 0..1, and then you can map or distort stuff elsewhere.
that's a lovely goal. if you can get everything done with that plan, then i congratulate you.
it gets a bit more complex when you write applications in which there are multiple points of control, some of them not even GUIs. but that gets back to the Cocoa "confusion" about MVC programming, in which the traditional roles of the Model, the View and the Controller are subverted somewhat. if you're comfortable with the Cocoa definition of these things, then you'll probably consider the approach to solving the multiple-points-of-control problem very straightforward. if you prefer the "original" definition of MVC, then it gets a bit more complex.
either way, you end up with multiple "controllers", each of which can potentially require their own separate "control view" to adjust how the controller itself maps the canonical range to a model-centric value. the classic case of this would be a gain fader, which the user might choose to operate with a different range than the initial one. now add in the existence of a MIDI or OSC hardware control, and things get a bit more complicated - are they both operating on the same controller? are they both views or both controls or both view+control or something else?
I have an intermediate step if I understand you correctly. So for a GUI I go from GUI >> GUI Controller >> Parameter Specification Map >> Model. The parameter specification map expects input in the 0..1 range to produce a model value, and can also unmap the other way, from a model value back to the 0..1 range. At the same time it can produce a textual representation (dB, kHz, etc.) for feedback in the GUI or automation editors. I use the same class in the GUI and the AU.

So from a MIDI controller I'd expect to either scale the MIDI controller value to 0..1 first, or offer an alternative method in the map that does so automatically.
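[Editor's note: Stephen's parameter specification map might be sketched roughly as below. All names here are hypothetical, and a linear mapping is assumed for simplicity; a real implementation would likely warp some parameters, e.g. logarithmically for frequency.]

```cpp
#include <cmath>
#include <sstream>
#include <string>

// Hypothetical sketch of a "parameter specification map": converts a
// normalized 0..1 control value to a model value and back, and renders
// a textual representation for GUI or automation feedback.
class ParameterSpec {
public:
    ParameterSpec(float minValue, float maxValue, std::string unit)
        : min_(minValue), max_(maxValue), unit_(std::move(unit)) {}

    // Map a normalized 0..1 value into the model's range (linear here).
    float map(float normalized) const {
        return min_ + normalized * (max_ - min_);
    }

    // Unmap a model value back to 0..1, e.g. for drawing a fader.
    float unmap(float value) const {
        return (value - min_) / (max_ - min_);
    }

    // Scale a 7-bit MIDI controller value (0..127) into the model range,
    // the "alternative method" mentioned above.
    float mapMIDI(int cc) const {
        return map(static_cast<float>(cc) / 127.0f);
    }

    // Textual representation for the GUI or automation editors.
    std::string text(float value) const {
        std::ostringstream out;
        out << value << " " << unit_;
        return out.str();
    }

private:
    float min_, max_;
    std::string unit_;
};
```

With a gain parameter declared as `ParameterSpec gain(-60.0f, 12.0f, "dB")`, a fader at 0.5 maps to -24 dB, and the same object unmaps -24 dB back to 0.5 for display, which is what lets one class serve both the GUI and the AU side.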
Does nobody use the full AudioUnit standard of realistic values? Is
everyone coding to the lowest common denominator of VST?
Seems like the example of a gain fader where the user has selected a
different range would work easily if the AU parameter were marked as
"dB" and every control worked with the actual dB value as a float.
There'd be no confusion about what 0.0 and 1.0 mean in this case.
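[Editor's note: the AudioUnit API does support this directly - an AU can publish a parameter's unit as kAudioUnitParameterUnit_Decibels in its AudioUnitParameterInfo, so hosts and controls exchange the real dB value. A minimal sketch of the convention Brian describes, with the dB-to-linear conversion happening only at the render stage:]

```cpp
#include <cmath>

// Convention sketch: controls, automation, and the host all pass around
// the actual dB value; only the DSP render stage converts dB into a
// linear amplitude multiplier. 0.0 dB is unambiguously unity gain.
inline float dbToLinear(float dB) {
    return std::pow(10.0f, dB / 20.0f);
}

inline float linearToDb(float amplitude) {
    return 20.0f * std::log10(amplitude);
}
```

So a fader set to 0.0 dB multiplies samples by exactly 1.0, and a hardware controller sending "-6 dB" means the same thing to every view and control in the system, with no normalized 0..1 ambiguity.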
Brian Willoughby
Sound Consulting
_______________________________________________
Coreaudio-api mailing list (email@hidden)