From The Linux Foundation
Revision as of 20:19, 9 June 2008 by Oedipus (Talk | contribs)


Minutes of the Open A11y Expert Handlers SIG Call 2008/05/19


  • Neil Soiffer (NS/chair)

Agenda Review

Approval of Last Meeting's Minutes

Minutes of Expert Handlers Conference Call 2008/05/19

Note: These minutes have not yet been finalized.

No one took notes during the meeting, but here is what was discussed (from NS):

  • What Mozilla does for XForms (builds an accessibility tree (atree) based on the content)
  • VB described what ViewPlus does to make SVG accessible: it builds a parallel tree describing how to speak the SVG. Each node of the parallel tree has an ID. Only some of the SVG nodes are augmented with an attribute that points to IDs in the parallel tree; the attribute can point to many such IDs. E.g., in a human body, the SVG representing a blood vessel may point to the parallel tree description of the hand that the blood vessel lies in, the circulatory system description, etc.
  • NS suggested that perhaps an application would call upon an Expert Handler (EH) to build the accessibility tree for the expert ML and then AT wouldn't need to deal with an EH.
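The parallel-tree scheme VB described might look something like the sketch below. The attribute and element names (described-by, speechTree, node) are invented for illustration; the minutes do not record ViewPlus's actual markup.

```xml
<!-- SVG content: some nodes carry a hypothetical attribute listing
     IDs of speech descriptions in a parallel tree. -->
<svg xmlns="http://www.w3.org/2000/svg">
  <path id="vessel42" d="..." described-by="hand-desc circ-desc"/>
</svg>

<!-- Parallel tree: each node has an ID and the text to speak. -->
<speechTree>
  <node id="hand-desc">a blood vessel in the left hand</node>
  <node id="circ-desc">part of the circulatory system</node>
</speechTree>
```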

The group explored this idea at some length and these important points were raised:

  • The EH could be built-in or separate, with a calling interface that we define.
  • The atree represents the default navigation order.
  • The atree needs to be enhanced to handle braille and "enhanced speech" (text with speech cues as per SSML).
  • Somehow, the number of braille cells to use for braille generation needs to be communicated to the EH (perhaps through IA2 or some other interface).
  • Math has more than 20 different braille codes, so the desired braille code needs to be communicated to the EH, which can return a success or failure code depending on whether it supports that code.
  • We need to find out if other expert languages require other info passed in to support either braille or speech generation.
    NS note while writing minutes: there are different ways to speak math, so perhaps some sort of user choice needs to be passed also.

PB clarified the roles of "name", "value", and "description" in MSAA.

  • MSAA get_accName is for what the AT should speak
  • MSAA get_accDescription is for a longer description of the object if needed
  • IA2 IAText::text is for the actual text as seen on the screen

NS mentioned that MathPlayer sticks enhanced speech in the description field, but that is non-standard and no AT uses it.
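The division of labor among the three fields could be illustrated with the following mock. This struct is purely illustrative; real access goes through the MSAA/IA2 COM interfaces (IAccessible, IAccessibleText), not a plain struct.

```cpp
#include <cassert>
#include <string>

// Hypothetical mock of the three fields PB described, for a math object
// such as the fraction 1/2. Not an actual MSAA/IA2 data structure.
struct MathObjectInfo {
    std::string accName;        // MSAA get_accName: what the AT should speak
    std::string accDescription; // MSAA get_accDescription: longer description if needed
    std::string iaText;         // IA2 IAText::text: the actual text as seen on screen
};

MathObjectInfo describeFraction() {
    return {
        "one half",                  // spoken form
        "the fraction one over two", // longer description
        "1/2"                        // on-screen text
    };
}
```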

PB (post meeting note): We talked about how IA2 might be enhanced with a new interface, e.g. IATextForDevice with methods like

stringForDevice([in] enum device, [in] long bufferSize, [out] BSTR deviceString)


  • device indicates the device requirements, e.g. SSML or SAPI4 or one of n kinds of Braille devices,
  • bufferSize would be something like -1 for TTS and something like 40 or 80 for Braille, and
  • deviceString would be the EH's markup for the requested device.

The AT would first try one or more calls to IATextForDevice (more than one if there was a priority scheme where one type of code was preferred if available but a secondary code was acceptable as a backup; e.g., the AT user may have specified that they prefer Australian Braille over UK Braille). If stringForDevice failed, then MSAA's get_accName could be used as a fallback.
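The fallback logic could be sketched as follows. ExpertHandler, Device, and brailleOrName are illustrative mocks standing in for the proposed (not yet existing) IATextForDevice interface and the AT's calling code; they are not real IA2/COM types.

```cpp
#include <cassert>
#include <string>

// Hypothetical device/code identifiers, mirroring the proposed enum parameter.
enum class Device { SSML, SAPI4, BrailleAU, BrailleUK };

// Mock expert handler that supports only SSML and UK braille.
struct ExpertHandler {
    // Sketch of stringForDevice: returns true (success) and fills 'out'
    // if the requested device/code is supported, false otherwise.
    bool stringForDevice(Device device, long bufferSize, std::string& out) {
        switch (device) {
            case Device::SSML:
                out = "<speak>x squared</speak>";
                return true;
            case Device::BrailleUK:
                // Truncate to the requested number of braille cells.
                out = std::string("x^2").substr(0, bufferSize);
                return true;
            default:
                return false; // unsupported: the AT should fall back
        }
    }
    // Stand-in for MSAA get_accName: plain text for the AT to speak.
    std::string accName() { return "x squared"; }
};

// AT-side priority scheme: try the preferred braille code, then a
// secondary one, then fall back to the plain accessible name.
std::string brailleOrName(ExpertHandler& eh, Device preferred, Device secondary) {
    std::string s;
    if (eh.stringForDevice(preferred, 40, s)) return s;
    if (eh.stringForDevice(secondary, 40, s)) return s;
    return eh.accName();
}
```

Here a user preferring Australian Braille with UK Braille as backup would get the UK braille string, while a request the handler cannot satisfy at all would fall back to get_accName.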