- 1 Minutes of the Open A11y Expert Handlers SIG Call 2007/12/17
- 1.1 Attendance
- 1.2 Minutes of Expert Handlers Conference Call, 17 December 2007
- 1.3 Identify Items for the 6 January 2008 Conference Call
Minutes of the Open A11y Expert Handlers SIG Call 2007/12/17
- Neil Soiffer (NS/chair)
- Gregory J. Rosmaita (GJR/scribe)
- Janina Sajka (JS)
- regrets: Pete Brunet
New Agenda Items
- who/what is the Accessibility Interoperability Alliance (AIA) and how can we work with them, enter into a dialogue with them, and ensure harmonization of efforts?
- AIA Launches (Press Release)
- The following 2 blockquotes are excerpted from the AIA web site:
The AIA is run by a three-member Steering Committee, which consists of one IT member, one AT member, and one At-Large member as Chair of the group. At the formation of the AIA, the initial steering committee was selected. After the initial term expires on January 31, 2010, the IT and AT steering committee members will be elected by their respective industry sectors; all members will participate in the election of the At-Large member. Steering Committee members serve two-year terms.
- IT Member: Adobe - Andrew Kirkpatrick, Corporate Accessibility Engineering Lead
- AT Member: QualiLife - Claudio Giugliemma, CEO
- At-Large Member: Microsoft - Rob Sinclair, Director of Accessibility
Working Groups and Current Projects
The AIA conducts its activities via working groups composed of representatives of member companies, each focused on one project.
The first four AIA projects include:
- UIA Automation/Express Specification: Cross-industry review of the specification and work to extend the specification to include support for rich documents, such as those with tables
- Internet Explorer Support of Web ARIA via UIA Express: Work to enhance Microsoft's IE browser to support these additional specifications
- Interoperability of Accessibility APIs: UIA and IAccessible2: Work to extend UIA and IAccessible2 so that accessibility functionalities of products built to these specifications can be interoperable
- Common Keyboard Shortcuts for AT Products Used with Web Browsers: Work to develop a set of keyboard shortcuts that can provide consistent device command functionality across different browsers
Additional projects, as approved by the AIA Steering Committee, will follow.
Approval of Last Meeting's Minutes
Minutes of Expert Handlers Conference Call, 17 December 2007
Topic 1: Update on Progress: Expert Handlers Flow Control and Unified Use Case Drafts
- Expert Handlers Use Cases Drafts Index
- Unified Use Cases Drafts Index
- Expert Handlers and the Flow of Control (Draft 1)
Topic 2: Review of Unified Use Case Draft 2.0
GJR: my first pass at a unified use cases draft was simply a compilation of individual Expert Handlers Use Cases; the second draft added:
- used our mission statement as the introduction
- need to contact Vladimir directly -- edits to his braille use case draft were the most extensive; needs review to ensure that it still states what he meant it to state
- took quick pass at attempting to reduce redundancy -- need to take a closer pass
ACTION GJR: contact Vladimir Bulatov directly and ask for a review of the edits to the braille use case section of the Unified Use Case Draft, 2.0
NS: used the first person occasionally -- needs to be edited out
NS: Navigation -- more detailed than the rest of the sections -- should it be pared down to match, or do the other sections need more detail?
GJR: that syncs with the meta-questions I had about the draft:
GJR: 1. braille - capitalized "b" or lower-case "b"?
NS: NFB advises lower-case for "braille" in reference to the tactile system
JS: capitalized when used as a proper-name reference to either a person or a standard
RESOLVED: braille shall be spelt with a lower-case b
GJR: meta-question 2: in what order should the use cases be listed?
NS: speech most obvious, and probably should be first
GJR: no problem with that
JS: no opinion yet -- need to reflect
NS: here are 4 use cases for an expert handler -- 1. speech 2. navigation 3. magnification 4. braille
JS: also in reverse alphabetic order
RESOLVED: the order of the use cases will be: 1. speech; 2. navigation; 3. magnification; 4. braille
GJR: third meta-question: are speech input and speech output a single use case?
JS: think they are 2 different things
NS: I agree -- speech input is like alternative keyboard input
JS: have an outstanding action item on that
NS: maybe need more generic input section/use case --
JS: mappable input?
NS: possible use case category
JS: generic aspect -- highly successful alternative input solutions
NS: the question is: is there anything needed by an expert system to deal with input that isn't covered by the general case; for example, when using an eye-tracker, is there a specific input mechanism that is used?
JS: context-sensitive input mechanism in conjunction with alternate input (autocomplete);
NS: restricted vocabulary
ACTION JS: post to list alternate input / generic input -- bulleted list or prose
Topic 3: Next Steps, Finalization, Workgroup Review, and Publication of a Unified Use Case Document
NS: in general, when I re-read it, I thought "that's not bad!" -- needs some editing before being brought to the full Open A11y workgroup; then what?
JS: we get to publish it as a use case document -- how do we make this happen?
NS: formal announcement?
JS: yes, need to do that
NS: need more visibility -- need more bodies if we're going to get the work done
GJR: Linux Foundation press release?
GJR: push for mid-to-late January
JS: 2 meta-concerns: first is the versioning/numbering of the draft -- think this is draft 2, not draft 1b; it's a major revision
GJR: ok, will fix
JS: second: concerned that we split out the case for the technology from how it might be implemented; that has been somewhat difficult to keep clean in the past
GJR: any feedback welcome and appreciated
JS: cleanly state the use cases, then, because of these use cases, state that we have a requirement for X, Y, and Z, clearly specified or delineated in markup
GJR: I will try to draft requirements
JS: requirements specify kind of things handler technology needs to provide and AT and IT in general will need to support
ACTION GJR: draft requirements section and vet on-list
JS: think we are close to finalizing; next steps: finding funding for a developer -- need to speak with AT people; the reaction we're looking for is "yeah, that makes sense, people need it, we don't want to do it ourselves, tell us how and we'll implement"
GJR: still need to do more background checking into where the UWA at the W3C stands -- a natural ally, in that they, too, need a generic handler that includes a purpose element, as well as one which can be reused to deliver to a multiplicity of modalities -- we need a generic handler to build upon before we can construct an expert handler
NS: should address the flow of control
JS: incorporate -- best to have all in one place -- ease of review, etc.
Identify Items for the 6 January 2008 Conference Call
- adjourned: 2:45 PM EST/1945 UTC
- next Expert Handlers call: 6 January 2008