Accessibility/Planning/Android

From MozillaWiki

Note that there is a recent Google I/O video on Android accessibility.

Questions

  • What is Google's approach to Android browser accessibility: TalkBack or ChromeVox? (Trev: we know it's all ChromeVox now, don't we?)
  • Are new APIs planned that would allow non-focused elements to send an AccessibilityEvent, or that would allow querying information about arbitrary Views? (Presumably the Android fork David mentioned in passing would have these?)


Implementation ideas

  • We can take one of several approaches:
    • Inject ChromeVox into Firefox mobile, which uses the Android framework TTS directly to do self-voicing (both Chrome OS and the native Android browser work this way)
    • Inject FireVox
    • Use our own accessibility layer and, based on our accessible events, fire AccessibilityEvents (so TalkBack would speak)

Tasks

In order to evaluate these options, we need to figure out some stuff:

  • Check whether we can easily expose Java objects from the main application to our JS engine (for TTS calls inside ChromeVox). This fits nicely with the TTS-in-JavaScript-using-native-APIs project, but would be a dependency of it.
  • Check whether ChromeVox's interaction model works for us (does it speak mostly when something is focused? If so, are we moving focus the same way WebKit does, allowing all elements to be focused?). Does it use DOM or other APIs that we also provide?
  • Explore whether we can achieve similar results by having our accessible code fire AccessibilityEvents.
  • Check the accessibility of Firefox chrome widgets (dialogs, preferences, etc.). Are these native Android widgets or our own? If they're ours, it sounds like we'll have to coordinate the self-voicing scripts across two processes (how would this work?).
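For the first task, the shape of the bridge would be a registry that publishes Java objects under names the injected script can resolve, much like the registration style of WebView's addJavascriptInterface. Here is a minimal sketch: SpeechEngine stands in for android.speech.tts.TextToSpeech, and JsBridge stands in for whatever export mechanism Firefox mobile would actually use; all names here are illustrative, not existing Firefox APIs.

```java
import java.util.HashMap;
import java.util.Map;

public class TtsBridge {
    /** Stand-in for android.speech.tts.TextToSpeech. */
    static class SpeechEngine {
        final StringBuilder spoken = new StringBuilder();
        void speak(String text) {
            spoken.append(text); // the real engine would queue audio instead
        }
    }

    /** Minimal registry of Java objects made visible to injected script. */
    static class JsBridge {
        private final Map<String, Object> exported = new HashMap<>();
        void export(String name, Object obj) { exported.put(name, obj); }
        Object lookup(String name) { return exported.get(name); }
    }

    public static void main(String[] args) {
        SpeechEngine tts = new SpeechEngine();
        JsBridge bridge = new JsBridge();
        bridge.export("androidTts", tts); // ChromeVox-style JS would resolve "androidTts"

        // Java-side equivalent of the script calling androidTts.speak(...)
        ((SpeechEngine) bridge.lookup("androidTts")).speak("Link, Downloads");
        System.out.println(tts.spoken);
    }
}
```

If an export mechanism like this works, the injected ChromeVox code could drive the framework TTS directly, and the TTS-in-JavaScript project would sit on top of the same bridge.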