Proposal Audacity 4 Blind

Proposal pages help us move from feature requests to actual plans. This proposal page is about the accessibility of Audacity for blind users.
Proposal pages are used on an ongoing basis by the Audacity development team and are open to edits from visitors to the wiki. They are a good way to get community feedback on a proposal.


  • Note: Proposals for Google Summer of Code projects are significantly different in structure, are submitted via Google's web app and may or may not have a corresponding proposal page.

Proposed Feature

Instead of using wxAccessible to make controls within Audacity accessible to blind users, this proposal splits Audacity into an audio engine and a GUI. The GUI handles the visual-only aspects: bitmaps, mouse clicks and drags. The underlying audio engine handles the audio and selections. The interface between the audio engine and the GUI is defined in files that SWIG can process, allowing us to script Audacity, so Audacity 4 Blind is closely related to Scripting. There is a concept of a 'clutch': when it is engaged, the visual GUI tracks requests and changes made to the underlying audio engine. The GUI is a plug-in, so it does not have to be present; in principle we could plug in different GUIs.
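
To make the split and the 'clutch' concrete, here is a minimal Python-style sketch; the class names (AudioEngine, GuiPlugin) and methods are invented for illustration and are not actual Audacity or SWIG-generated code:

 # Hypothetical sketch only: AudioEngine and GuiPlugin are invented names,
 # not actual Audacity classes.
 
 class AudioEngine:
     """Owns tracks, selections and audio operations; knows nothing about the GUI."""
     def __init__(self):
         self.selection = (0.0, 0.0)
         self.listeners = []                 # GUIs or scripts wanting change notifications
 
     def attach(self, listener):
         self.listeners.append(listener)
 
     def set_selection(self, start, end):
         self.selection = (start, end)
         for listener in self.listeners:
             listener.on_change("selection", self.selection)
 
 class GuiPlugin:
     """Optional visual layer; it only mirrors the engine while the clutch is engaged."""
     def __init__(self, engine):
         self.clutch_engaged = True
         engine.attach(self)
 
     def on_change(self, what, value):
         if self.clutch_engaged:
             print("GUI redraws", what, "=", value)
 
 engine = AudioEngine()
 gui = GuiPlugin(engine)                     # the GUI is a plug-in; a script could omit it
 engine.set_selection(1.5, 4.0)              # engine change is tracked by the GUI via the clutch

A script driving the engine directly would simply never attach a GUI listener, which is what makes the same interface usable both for sighted users and for screen-reader driven workflows.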

The above gets us about halfway to where we want to be with Audacity 4 Blind. It gives us a script-based interface that exposes all the features. Proposed refinements include:

  • Value tweakers - the ability to temporarily bind a variable to the keyboard so that (for example) the left and right arrows increase and decrease the value, and the step size can be varied while listening to the audio (depends on real-time improvements such as real-time looping); see the sketch after this list.
  • More sophisticated keyboard shortcut methods.
  • Efficient audio help information. For example, we want to be able to list the functions that can be applied, and the user should be able to navigate through this list quickly without having to listen to it all.

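A minimal sketch of how a value tweaker might behave, assuming the left/right arrows step the value and the up/down arrows change the step size; the class and key names are illustrative only:

 # Illustrative sketch of a value tweaker; the key names and step behaviour are assumptions.
 
 class ValueTweaker:
     """Temporarily binds a numeric parameter to the arrow keys with an adjustable step."""
     def __init__(self, name, value, step=1.0):
         self.name, self.value, self.step = name, value, step
 
     def on_key(self, key):
         if key == "Right":
             self.value += self.step
         elif key == "Left":
             self.value -= self.step
         elif key == "Up":                   # coarser steps
             self.step *= 10
         elif key == "Down":                 # finer steps
             self.step /= 10
         print(self.name, "=", self.value, "(step", str(self.step) + ")")
 
 gain = ValueTweaker("gain_dB", 0.0, step=1.0)
 for key in ["Right", "Right", "Down", "Left"]:   # simulated keystrokes while audio loops
     gain.on_key(key)
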

Developer Backing

  • James
  • Leland


Use Cases


Details

So how exactly will the 'efficient audio help information' operate? Current thinking is in terms of trees, like existing menus, but rather more deeply nested and with each level shorter than current menus are. One problem blind users face is not knowing how long a list is before listening to it! Something complex like effects might be offered as:

84 Effects
7 Categories
9 Technologies
12 Alphabetic groups

You then choose whether to explore by categories (time-preserving, reverbs, repair...), by 'technologies' (Built-in, VST, LADSPA, Nyquist...), or alphabetically (A-B, C-D, E-F...). For alphabetic exploration you would usually type a letter, e.g. R, and start hearing all effects that begin with R. We allow items to appear more than once in the menu, since the different organisations give better options for navigating.
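
As an illustration, announcing the length of each grouping up front, and filtering by a typed letter for the alphabetic view, might look like the sketch below; the effect names, groupings and counts are placeholders, not the real lists:

 # Placeholder effect names and groupings, purely to illustrate the idea.
 effects = ["Amplify", "Repair", "Reverb", "Reverse"]
 menu = {
     "Categories":        ["Time-preserving", "Reverbs", "Repair"],
     "Technologies":      ["Built-in", "VST", "LADSPA", "Nyquist"],
     "Alphabetic groups": ["A-B", "C-D", "E-F", "R-S"],
 }
 
 def announce(menu, total):
     """Say how long each list is before the user commits to hearing it."""
     print(total, "Effects")
     for grouping, entries in menu.items():
         print(len(entries), grouping)
 
 def by_letter(effects, letter):
     """Typing a letter starts reading the effects that begin with it."""
     return [e for e in effects if e.upper().startswith(letter.upper())]
 
 announce(menu, len(effects))
 print(by_letter(effects, "R"))              # -> ['Repair', 'Reverb', 'Reverse']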

We'll want some standard exploration methods besides up, down, previous and next. Possibly 'help' to go and read the manual. Possibly 'curt' to ask the system to say things as briefly as possible, for when revisiting lists that have been navigated before.

Typical blind users will set hotkeys to locations in these menus that are useful to them. At worst this menu system will be no worse than a conventional menu system customised for blind users. With the extra smarts it could be a lot better. We'll make sure every feature of Audacity is exposed by some 'menu item'.
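
A rough sketch of how such navigation, 'curt' mode and user-set hotkeys might fit together; apart from 'curt', which comes straight from the description above, the names (MenuNavigator, the F5 key) are assumptions made for illustration:

 # Sketch only: 'curt' is from the description above; the F5 key and other
 # names are invented for illustration.
 
 class MenuNavigator:
     def __init__(self):
         self.path = []                      # current location, e.g. ["Effects", "Reverbs"]
         self.curt = False                   # brief announcements when revisiting lists
         self.hotkeys = {}                   # user bookmarks into the menu tree
 
     def speak(self, text, detail=""):
         print(text if self.curt else (text + " " + detail).strip())
 
     def enter(self, item):
         self.path.append(item)
         self.speak(item, "- press Down to hear the first entry")
 
     def up(self):
         if self.path:
             self.speak(self.path.pop())
 
     def set_hotkey(self, key):
         self.hotkeys[key] = list(self.path)
 
     def jump(self, key):
         self.path = list(self.hotkeys.get(key, []))
         self.speak(" > ".join(self.path) or "top level")
 
 nav = MenuNavigator()
 nav.enter("Effects"); nav.enter("Categories"); nav.enter("Reverbs")
 nav.set_hotkey("F5")                        # the user bookmarks a useful location
 nav.up(); nav.up(); nav.up()
 nav.curt = True
 nav.jump("F5")                              # later: jump straight back, announced briefly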

In all this I am assuming that we are still using a third-party screen reader to give accessibility to the text.

Experiments