Minutes of the Open A11y Expert Handlers SIG Call 2008/09/22

Attendance

  • Neil Soiffer (NS/chair/scribe)
  • Pete Brunet (PB)
  • Vladimir Bulatov (VB)
  • Janina Sajka (JS)



Minutes of Expert Handlers Conference Call 2008/09/22

The call began with an apology from NS for not getting minutes done for last week's meeting with Aaron. There was no recording, and NS's notes were too rough to use. Kudos to Gregory's ability to speak and take notes at the same time.

There was a summary of what Aaron suggested: using ARIA roles to indicate what goes into the accessibility DOM. The roles might be added by authoring tools or by an EH (it was not clear which he suggested).
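As a rough illustration (not something Aaron spelled out), an EH or an authoring tool could tag a region of specialized content with an ARIA role so the browser exposes it in the accessibility tree. A minimal TypeScript/DOM sketch; the element id "eqn1" and the label text are made up for the example:

    // Hedged sketch: tagging specialized content with an ARIA role so it
    // surfaces in the accessibility tree. The id and label are illustrative.
    const mathRegion = document.getElementById("eqn1");
    if (mathRegion) {
      mathRegion.setAttribute("role", "math");
      // A plain-text fallback the AT could read if no expert handler is present.
      mathRegion.setAttribute("aria-label", "a squared plus b squared equals c squared");
    }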

NS: When is the Accessibility DOM built? Is there a notification if the DOM changes, or if an EH wants to run?

No one knew, and it was agreed that having Aaron join us again next week would be good.

NS: Even if there are roles added, how do you get the appropriate speech text or braille out of the application?

NS: What's the best way to expose the accessible markup (speech text with speech cues, braille text)? How do you get those values? How does AT know about the speech cues? I think we should extend IA2 in some way to allow additional access methods.

VB: Graphics don't act like a tree, so walking a tree doesn't make sense for them. For graphics, a graphics EH should just take over.

PB: Walking the tree is not necessary. If someone touches something, it fires a focus event, and then you access the role, name, etc., of the node that fired the event. Can MSAA find a node based on a point in screen coordinates?

PB: What gets sent for the Braille?

NS: Use Unicode. There is a 256-character block (U+2800–U+28FF) representing the possible 8-dot patterns.

PB: Does this mean the EH needs to generate braille for each (spoken) language?

NS: Yes, the EH generates the needed dot patterns in Unicode. Neither the browser nor the AT needs to do anything, other than the AT mapping the dot patterns to the specific braille display.
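To make the Unicode point concrete: the Braille Patterns block runs from U+2800 to U+28FF, one code point per cell, with dot n mapped to bit n-1 of the offset. A minimal sketch (the function name is invented for illustration):

    // Minimal sketch: encode an 8-dot braille cell as a Unicode Braille
    // Patterns character. Dot n (1..8) corresponds to bit n-1 of the offset
    // from U+2800.
    function brailleCellToUnicode(dots: number[]): string {
      const mask = dots.reduce((bits, dot) => bits | (1 << (dot - 1)), 0);
      return String.fromCodePoint(0x2800 + mask);
    }

    // Dots 1, 3, 4 and 5 raised -> U+281D ("⠝"), the cell for the letter n.
    console.log(brailleCellToUnicode([1, 3, 4, 5]));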

PB: Is there a standard for speech cues?

JS: Yes: SSML, SAPI, ...

PB: Does IA2 just need to add a field for braille and another for text with speech cues, and pick a standard like SSML? The AT would check those first and, if they were empty, would use the name field.

NS: That's what I thought would be the way to go a few years back. There still needs to be a way to connect the EH to the node. That's what Aaron could help with. It already exists in IE.
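A minimal sketch of the lookup order PB describes; the brailleText and speechMarkup fields below are hypothetical IA2 extensions, not existing IAccessible2 attributes:

    // Hedged sketch: prefer the proposed braille / speech-cue fields and fall
    // back to the plain accessible name. The two new field names are
    // hypothetical; only "name" corresponds to something that exists today.
    interface AccessibleNodeInfo {
      name: string;           // plain accessible name
      brailleText?: string;   // Unicode braille dot patterns (proposed)
      speechMarkup?: string;  // e.g. SSML with speech cues (proposed)
    }

    function textForSpeech(node: AccessibleNodeInfo): string {
      return node.speechMarkup && node.speechMarkup.trim() !== ""
        ? node.speechMarkup
        : node.name;
    }

    function textForBraille(node: AccessibleNodeInfo): string {
      return node.brailleText && node.brailleText.trim() !== ""
        ? node.brailleText
        : node.name;
    }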

ACTION ITEM: Invite Aaron Leventhal to the next call and delay Glen until after we understand the browser better. Potentially, the time of the call might need to change to accommodate Aaron. Janina and Pete can't make 8-10 CDT on Monday.






