From The Linux Foundation
Revision as of 00:49, 23 August 2007 by Oedipus (Talk | contribs)


Open A11y Expert Handlers Committee Conference Call Minutes 2007/08/20


  • Neil Soiffer (NS) - chair and scribe
  • Janina Sajka (JS)
  • Pete Brunet (PB)
  • Vladimir Bulatov (VB)
  • Gregory Rosmaita (GR)
  • Shawn Djernes (SD) (Nebraska Commission for the Blind)


We began with introductions -- Shawn is a new participant who has joined the group.

We also had some discussion about the web site.

ACTION ITEM: Gregory to effect changes to Handlers sub-site discussed on call and proposed in his Handlers' web site update post

Neil didn't do last week's action item on the mission statement, so he took it on again.

ACTION ITEM: Neil to draft version 4 of Mission Statement (new as of 20 August 2007)

Gregory has taken over Navigability from Pete. He recapped last week's discussion.

Neil said that little research has been done on math navigation. The only study he knows of is available as a PostScript file; Google has a cached version of the math navigation study that might be more accessible. Warning: it is long.

JS: perhaps we could get NSF funding to research navigation.

GR: <fill in names, details about research on navigation>

We went over the resources that GR added to last week's minutes. These include links to Raman's work and FireVox and some MathML links.

Structure to Speech Use Cases

The main point is that speech is temporal -- braille and on-screen text are static. Hence, there are different strategies for navigating a page when using speech. Being able to silence the speech quickly is important.

Navigation via tabs is important. That seems to simulate a sighted person's ability to rapidly skip to the important parts of the page.

JS: Perhaps a way to speak certain symbols (eg, an integral sign) would be an important common function.

SD: We have this problem a lot.

NS: Perhaps this is our simplest handler -- something that knows how to speak Unicode symbols.

GR: the way Charles Chen (FireVox) does this is via CSS3 phoneme.

NS: this doesn't help for non-web apps (eg, OpenOffice).

GR: this might be useful for speech input by reversing the table.

NS: MathTalk is a speech-to-math system, and it has to use "alpha" for "a", etc, so it wouldn't be a direct reversal.

JS: maybe it should just be used for the non-ASCII parts, or more generally, for the parts that they don't directly know about.
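The fallback idea above -- speak symbols from a known table, and fall back for the parts a system doesn't directly know about -- could be sketched as a tiny handler. This is a minimal illustration only, not anything the committee specified; the table entries and function names are assumptions:

```python
import unicodedata

# Illustrative hand-built table of spoken names for symbols a
# synthesizer may not handle well (entries are examples only).
SPOKEN = {
    "\u222B": "integral",       # ∫
    "\u2211": "sum",            # ∑
    "\u221A": "square root",    # √
    "\u03B1": "alpha",          # α
}

def speak_symbol(ch):
    """Return a spoken form for one character, falling back to its
    Unicode character name for parts we don't directly know about."""
    if ch in SPOKEN:
        return SPOKEN[ch]
    try:
        return unicodedata.name(ch).lower()
    except ValueError:  # some characters have no name
        return "unknown symbol"

def speak(text):
    # Pass ASCII through untouched and expand only non-ASCII symbols,
    # along the lines suggested on the call.
    return " ".join(speak_symbol(c) if ord(c) > 127 else c for c in text)
```

The Unicode-name fallback is what makes this the "simplest handler": it needs no per-symbol work beyond the hand-tuned table.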

SD: Dragon works well on everything except math. I had to customize it for math.

NS: You need to worry about languages. Thankfully, web pages are supposed to have language tags.

JS: The OS also knows your current language.

NS: But the document may be in a different language. Also, pronunciation might depend on the region (eg, "full stop" vs "period").

JS: And the document might be in multiple languages.
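The language and region concerns raised here could be handled by keying the spoken-name table on a language tag and falling back from region to language to a default -- a hypothetical sketch, assuming BCP 47-style tags; the entries and names are illustrative:

```python
# Hypothetical region-aware lookup: spoken names keyed by language tag,
# falling back from a regional tag ("en-GB") to its language ("en").
NAMES = {
    ".": {"en-GB": "full stop", "en": "period"},
    "\u222B": {"en": "integral", "de": "Integral"},  # ∫
}

def spoken_name(symbol, lang_tag):
    """Look up a spoken name, trying the full tag first, then
    progressively shorter prefixes (en-GB -> en)."""
    entry = NAMES.get(symbol, {})
    tag = lang_tag
    while tag:
        if tag in entry:
            return entry[tag]
        tag = tag.rpartition("-")[0]
    return entry.get("en", symbol)  # assumed default language
```

A document's language tags would drive `lang_tag`, so a page marked `en-GB` reads "." as "full stop" while an `en-US` page reads it as "period".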

Next Handlers Meeting: 27 August 2007