Accessibility and the MacBook Pro Touch Bar
- Subject: Accessibility and the MacBook Pro Touch Bar
- From: Bill Cheeseman <email@hidden>
- Date: Fri, 14 Apr 2017 07:38:11 -0400
Can anybody provide tips or links about using the Accessibility API and Quartz Event Taps to monitor and control the 2016 MacBook Pro Touch Bar while it is being used to control another application?
I am aware that Apple's current Touch Bar documentation claims there is no API to monitor the Touch Bar because applications don't need to know when the Touch Bar is being used instead of the keyboard or mouse. But that isn't true for assistive applications, as Apple's own VoiceOver demonstrates. An assistive application sometimes needs to know when a user does something using the Touch Bar.
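For context, this is the kind of monitoring an assistive application can already do for ordinary UI activity via the Accessibility API — a minimal sketch only, with a placeholder pid and an arbitrary notification, and nothing here fires for Touch Bar interaction today:

import ApplicationServices

// Placeholder pid of the application being monitored (illustrative assumption).
let targetPID: pid_t = 12345
let appElement = AXUIElementCreateApplication(targetPID)

// C-style callback invoked when an observed AX notification fires.
let axCallback: AXObserverCallback = { _, element, notification, _ in
    print("AX notification received:", notification)
}

var observer: AXObserver?
if AXObserverCreate(targetPID, axCallback, &observer) == .success, let observer = observer {
    // Focus changes are just an example; no comparable notification exists for the Touch Bar.
    AXObserverAddNotification(observer, appElement,
                              kAXFocusedUIElementChangedNotification as CFString, nil)
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
}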
Also, it would sometimes be useful to intercept and even discard an event generated from the Touch Bar using Quartz Event Taps, just as one can do with keyboard, screen, tablet and scroll wheel events. In fact, if one can't do this with Touch Bar events, it destroys the usefulness of being able to do it with other hardware input events. The Touch Bar is an alternative input source for the user, and an assistive application can't currently determine whether the Touch Bar is what the user used.
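To be concrete about what I mean, here is a minimal sketch of an event tap on key-down events; returning nil from the callback discards the event, which is exactly the capability I would like for events originating from the Touch Bar (the mask and key handling are illustrative, and the process needs Accessibility permission):

import CoreGraphics

// Tap key-down events session-wide.
let mask = CGEventMask(1 << CGEventType.keyDown.rawValue)
let tapCallback: CGEventTapCallBack = { _, type, event, _ in
    if type == .keyDown {
        let keyCode = event.getIntegerValueField(.keyboardEventKeycode)
        print("key down, keycode \(keyCode)")
        // return nil   // returning nil here would swallow the event
    }
    return Unmanaged.passUnretained(event)
}

if let tap = CGEvent.tapCreate(tap: .cgSessionEventTap,
                               place: .headInsertEventTap,
                               options: .defaultTap,
                               eventsOfInterest: mask,
                               callback: tapCallback,
                               userInfo: nil) {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    CFRunLoopRun()
}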
I assume VoiceOver is using private API. It needs to be made public. Can anybody point me to information about it?
I have not yet explored the use of gesture recognizers in the context of an assistive application. Since the Touch Bar uses gesture recognizers, perhaps that is a way to approach the issue. Can anybody point me to information about gesture recognizers generally in the context of assistive applications?
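What I mean by the Touch Bar using gesture recognizers is illustrated below, but note this only works inside one's own app, not from an assistive application observing another process — which is precisely the gap (the identifier and class names are made up for the sketch):

import AppKit

// A custom Touch Bar item whose view carries an ordinary gesture recognizer
// restricted to direct (Touch Bar) touches.
final class TouchBarDemo: NSObject {
    let item = NSCustomTouchBarItem(identifier: NSTouchBarItem.Identifier("com.example.demo-item"))

    override init() {
        super.init()
        let view = NSView()
        let click = NSClickGestureRecognizer(target: self, action: #selector(handleClick(_:)))
        click.allowedTouchTypes = .direct   // Touch Bar touches report as "direct" touches
        view.addGestureRecognizer(click)
        item.view = view
    }

    @objc func handleClick(_ recognizer: NSClickGestureRecognizer) {
        print("Touch Bar item tapped")
    }
}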