Re: Making VoiceOver see changes
- Subject: Re: Making VoiceOver see changes
- From: Håkan Waara <email@hidden>
- Date: Thu, 9 Nov 2006 00:44:00 +0100
On Nov 9, 2006, at 12:13 AM, Ricky Sharp wrote:
On Nov 8, 2006, at 2:23 PM, Michael Ash wrote:
I'm adding Accessibility and VoiceOver support to a custom
control. I've done well with it so far, except for getting
VoiceOver to see changes made to the control. This particular
control has UI elements inside a larger view, and they can be
moved horizontally using the arrow keys. My trouble is that VO
does not re-read the value of the element, nor does it move the
focus box to track it.
I've tried to tell VO that my element is moving around and
changing, using the following code and many variations on same:
NSAccessibilityPostNotification( self, NSAccessibilityResizedNotification );
NSAccessibilityPostNotification( self, NSAccessibilityMovedNotification );
NSAccessibilityPostNotification( self, NSAccessibilityValueChangedNotification );
I've verified that the currently focused accessibility element is
self and that this code is being executed. But no matter what I
try, VO does absolutely nothing. If I manually move the focus to
another element and then back, VO reflects the updated position
and value.
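(For concreteness, here is a sketch of the pattern being described; the key-handling code and the helper method are illustrative, not the actual control code:)

// Illustrative sketch only -- hypothetical names, not the original control.
// An arrow key moves the focused sub-element, then the view tells the
// accessibility layer that the element changed.
- (void)keyDown:(NSEvent *)event
{
    NSString *characters = [event charactersIgnoringModifiers];
    if ([characters length] == 0) {
        [super keyDown:event];
        return;
    }

    unichar key = [characters characterAtIndex:0];
    if (key == NSLeftArrowFunctionKey || key == NSRightArrowFunctionKey) {
        float delta = (key == NSRightArrowFunctionKey) ? 1.0f : -1.0f;

        // Hypothetical helper that moves the selected sub-element and
        // updates the control's value accordingly.
        [self moveSelectedElementBy:delta];
        [self setNeedsDisplay:YES];

        // Tell the accessibility layer about the change, posting on the
        // object exposed as the focused UI element (self in this case).
        NSAccessibilityPostNotification(self, NSAccessibilityValueChangedNotification);
        NSAccessibilityPostNotification(self, NSAccessibilityMovedNotification);
    } else {
        [super keyDown:event];
    }
}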
It's obvious that this is possible. NSSlider, for example, makes
VO re-read its value when changing it using the arrow keys. But I
don't know what the secret sauce is to make it happen.
I found a post on this list with basically the same problem here:
http://lists.apple.com/archives/accessibility-dev/2006/Jul/msg00030.html
Unfortunately I didn't find an answer, either to that post or
anywhere else I could think to look.
That was my post, and yes, there's currently no answer at all.
My guess is that NSSlider (and perhaps others) is using some private API
to do its work. I will at least be filing enhancement requests soon to
ensure developers are given the proper hooks.
There are also other things that VoiceOver does for which I've found no
mechanism to tap into from my custom controls. I'll file enhancement
requests for those as well.
The informal NSAccessibility protocol is *too* informal; you can
choose to implement (or not implement) almost anything you want, and
there is a big void where guidelines should tell you what you actually
need in order to make things work in the real world.
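(As a rough illustration of how little is actually required, here is a sketch of a custom NSView subclass picking and choosing from the informal protocol; the role and the value accessor are placeholders, not anyone's real control:)

// Sketch of what a custom NSView subclass can (but is not forced to)
// implement from the informal NSAccessibility protocol.
- (BOOL)accessibilityIsIgnored
{
    return NO;   // take part in the accessibility hierarchy
}

- (NSArray *)accessibilityAttributeNames
{
    // You decide what to expose; nothing says which of these VoiceOver needs.
    return [[super accessibilityAttributeNames]
               arrayByAddingObject:NSAccessibilityValueAttribute];
}

- (id)accessibilityAttributeValue:(NSString *)attribute
{
    if ([attribute isEqualToString:NSAccessibilityRoleAttribute])
        return NSAccessibilitySliderRole;    // placeholder role
    if ([attribute isEqualToString:NSAccessibilityValueAttribute])
        return [NSNumber numberWithFloat:[self currentValue]];   // hypothetical accessor
    return [super accessibilityAttributeValue:attribute];
}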
For example, one very useful thing would be if the attributes and
notifications that are crucial to making VoiceOver work were flagged
as such.
In general, I think most developers would appreciate it if VoiceOver's
internal mechanisms were documented, since it is the primary
accessibility application to target on OS X today.
For example, explicitly: when does it speak something? How does it
deal with broken parent/child hierarchies? Which attributes does it
support?
One way to do this, without having to write even a line of
documentation, would be to open-source VoiceOver itself and put it
under the APSL. I can, however, understand if this will never happen...
In order to make custom controls work correctly with VoiceOver,
having to do a lot of reverse-engineering is the rule rather than the
exception, and I believe OS X accessibility is held back by this fact.
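(For what it's worth, the crudest form of that reverse-engineering is simply logging what VoiceOver asks a view for; a sketch, assuming a custom NSView subclass:)

// Sketch: log every attribute request that reaches the view, to see
// which attributes VoiceOver actually asks for, and in what order.
- (id)accessibilityAttributeValue:(NSString *)attribute
{
    id value = [super accessibilityAttributeValue:attribute];
    NSLog(@"AX request: %@ -> %@", attribute, value);
    return value;
}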
/Håkan