The group discussed the draft mission statement; a new version was drafted at this page.
GR gave an overview of the work he has done, and everyone agreed with it. He also updated the mission statement in near real time.
VB gave an overview of braille and expert handlers. He discussed the problems with representing braille using Unicode characters as well as using ASCII. JS remarked that a Unicode deliverable, as discussed last week, would solve that problem.
VB said that tactile display is a good fallback for when there isn't a braille code for the subject matter. Even when there is a braille code, the user's language needs to be taken into account.
SD: I think language and type of braille should be handled by the AT.
GR: We need a natural language indicator/switch: lang="en-GB" would select British braille rules, lang="en-US" US rules, etc.
SD: Yes, with that the AT can decide how to render, similar to the way WindowEyes and Jaws switch languages when they encounter a language change in Word or HTML. Maybe we also need a way to move into sections of the data: being able to drill into text, for example going inside the parentheses of a complex equation, or stepping through fractions as they are displayed. Chemical formulas could be structured so the user can drill down to the atom level.
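The lang-based switch GR and SD describe could be sketched as an AT-side lookup from a document's language tag to a braille translation table. This is a minimal illustration, not anything agreed in the meeting: the function and table names below are hypothetical placeholders, and real braille translators keep their own table formats.

```python
# Hypothetical mapping from BCP 47 language tags to braille rule tables.
# The table names are illustrative only.
BRAILLE_TABLES = {
    "en-GB": "british-grade2",   # British braille rules
    "en-US": "ueb-grade2",       # US rules
    "de": "german-grade2",
}

def table_for(lang_tag: str) -> str:
    """Resolve a language tag to a braille table: try the full tag,
    fall back to the primary language subtag, then to a generic
    8-dot computer braille table."""
    tag = lang_tag.strip()
    if tag in BRAILLE_TABLES:
        return BRAILLE_TABLES[tag]
    primary = tag.split("-")[0]
    return BRAILLE_TABLES.get(primary, "computer-8dot")

print(table_for("en-GB"))   # british-grade2
print(table_for("de-AT"))   # german-grade2 (falls back to "de")
print(table_for("fr"))      # computer-8dot (no table registered)
```

The fallback chain mirrors how screen readers degrade gracefully: honor the exact regional rules when known, otherwise use the base language, otherwise render raw computer braille.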
Action Item: think about a potential CSUN presentation on our work.
Next Meeting: Sept 10