revery-ui / revery

:zap: Native, high-performance, cross-platform desktop apps - built with Reason!
https://www.outrunlabs.com/revery/
MIT License

Proposal: Accessibility #293

Open bryphe opened 5 years ago

bryphe commented 5 years ago

One key issue with Revery's Flutter-like approach to UI widgets is accessibility. Using the native system widgets gives you accessibility "for free", but not so for Revery - without some extra support, the framebuffer is just a meaningless array of pixels 😄

We want to provide a toolkit with Revery so that users can build accessible apps, and also ensure that our out-of-the-box widgets are accessible. This is just a brain dump of some ideas; I would appreciate feedback or help!

API

I'm not an expert on accessibility, but did some testing with MSAA as well as ARIA on the web. Would welcome any critiques from experts!

To start, I'd like to propose adding an <Accessible /> component, inspired by MSAA's values:

Usage could be something like: <Accessible role="button" name="Submit" state="focused" />. This could be much improved by using real types for the values instead of just strings - but perhaps it's a starting point.
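A typed version of that API might look something like the following Reason sketch. Everything here is hypothetical - the variant cases and the component shape are assumptions, not an existing Revery API:

```reason
/* Hypothetical typed props for an <Accessible /> component,
   using variants instead of raw strings. */
type role =
  | Button
  | Checkbox
  | StaticText
  | List
  | ListItem;

type state =
  | Focused
  | Disabled
  | Checked;

/* Hypothetical usage:

     <Accessible role=Button name="Submit" states=[Focused]>
       <Button title="Submit" />
     </Accessible>

   Invalid roles or states then fail at compile time rather than
   being silently ignored at runtime. */
```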

Note that this is much simpler than the ARIA specification, but also much less expressive. Many of the ARIA primitives could be built on top of this - for example, a set of ARIA components could be created that use this <Accessible /> component under the hood but express more complicated relationships, like aria-labelledby.
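To make the layering concrete, an aria-labelledby-style relationship could be its own component that resolves a label reference to a plain name before delegating to the lower-level primitive. A rough sketch, where LabelRegistry and the component shape are both assumptions:

```reason
/* Hypothetical: an ARIA-style component expressing a labelled-by
   relationship, lowered onto the simpler <Accessible /> primitive. */
let ariaLabelledBy = (~labelId: string, ~role: string, ~children, ()) => {
  /* Look up the labelling element's text by id; LabelRegistry is an
     assumed registry populated elsewhere in the app. */
  let name = LabelRegistry.lookup(labelId);
  <Accessible role name> ...children </Accessible>;
};
```

The point of the design is that the core stays MSAA-simple, while richer ARIA semantics live in an optional layer above it.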

We'd need a set of native APIs to interact with each platform's particular accessibility framework. These would be shaped by the primitives we decide to expose via the <Accessible /> component.
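One way to keep the per-platform code contained would be a single module type that each platform (UIA on Windows, NSAccessibility on macOS, AT-SPI on Linux) implements. A hypothetical sketch - none of these names exist in Revery today:

```reason
/* Hypothetical per-platform bridge; one implementation per
   accessibility framework (UIA / NSAccessibility / AT-SPI). */
module type AccessibilityBridge = {
  type node;

  /* Create a node in the platform's accessibility tree. */
  let createNode: (~role: string, ~name: string) => node;

  /* Update state (focused, disabled, ...) on an existing node. */
  let setState: (node, ~state: string) => unit;

  /* Maintain the parent/child structure of the tree. */
  let appendChild: (~parent: node, ~child: node) => unit;

  /* Fire a platform focus-changed notification. */
  let notifyFocusChanged: node => unit;
};
```

The surface of this module type would follow directly from whatever primitives <Accessible /> ends up exposing.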

Implementation

For any platform, we'll have to maintain a 'virtual' hierarchy of accessible elements. This could be implemented on each platform in the following ways:
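Whatever per-platform mechanism is chosen, the virtual hierarchy itself might be modeled as a plain tree that mirrors the widget tree but contains only accessible nodes. A minimal sketch, with all names hypothetical:

```reason
/* Hypothetical shape of the 'virtual' accessible hierarchy. */
type accessibleNode = {
  role: string,
  name: string,
  states: list(string),
  children: list(accessibleNode),
};

/* On each reconciliation pass, the previous tree could be diffed
   against the next one, and only the changes pushed through the
   platform's accessibility API - much like a virtual DOM diff. */
```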

Testing

It's also important to test with screen reader applications like JAWS.

bryphe commented 5 years ago

Interesting proposal here for the JS side: https://github.com/WICG/aom

The AOM (Accessibility Object Model) is a way to specify an accessible hierarchy w/o DOM nodes. One use case would be to create accessible React components w/o ARIA. It could be interesting, and lighter-weight than the DOM for our JS implementation.

wishfoundry commented 5 years ago

I was just about to ask about this, but I see you already had some thoughts.

I think one concern I would have is integration with the web, in that any design you consider should include web as a priority.

IMHO, accessibility in general has two basic categories: navigation and discoverability.

For the web today, discoverability is probably not possible given that Revery renders to a canvas. There are no options I'm aware of for registering that your element contains multiple ARIA regions/landmarks, or for surfacing its navigation subtrees.

Controlling navigation seems like a feasible short-term goal, in that one could at least set up onBlur/onFocus listeners to enable focus trapping within the canvas element and yield control to the Revery app.

One proposal I might have is letting all of this live in its own hook, and opening Revery up to one of the promises of algebraic effects with support for user-land hooks. This would allow several community accessibility solutions to compete until the obvious best practices surface, or just allow users to select the best tool that works for them.
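Such a hook might look roughly like this - a sketch only, where useAccessible, Bridge, and the effect shape are all assumptions rather than existing Revery APIs:

```reason
/* Hypothetical user-land hook attaching accessibility metadata
   to the current element for its lifetime. */
let useAccessible = (~role: string, ~name: string) => {
  Hooks.effect(OnMount, () => {
    let node = Bridge.createNode(~role, ~name);
    Bridge.attach(node);
    /* Cleanup: detach from the accessibility tree on unmount. */
    Some(() => Bridge.detach(node));
  });
};
```

Keeping this in hook form means alternative implementations can be swapped in without touching the component tree.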

chinwobble commented 3 years ago

@bryphe I like your idea about maintaining a virtual hierarchy, although it sounds difficult. I see it could have other benefits, such as debugging support similar to the React developer tools.

To get something started, I think we can begin by supporting basic functionality on Windows. We can copy the uiaTracing.h and uiaTracing.cpp files into Revery's src/native directory: https://github.com/microsoft/terminal/pull/4826/files?file-filters%5B%5D=.cpp&file-filters%5B%5D=.h

This will allow us to start publishing ETW events, such as which button has focus, text selection changes, etc.

A simple proof of concept could be a Revery app with a few buttons: when you press Tab to change focus, ETW events are pushed. Ideally, all the out-of-the-box components will be screen-reader accessible without adding any extra components, just like HTML.

For my use case, I would like to write articles using oni2 today and proofread them with a screen reader (NVDA). I would need the cursor position and the editing area to be pushed.

zersiax commented 2 years ago

Has any progress on this been made? As it stands, I can currently neither use apps written using this toolkit, nor use this toolkit myself as I would not be able to test the UI I have created, or defend using an inaccessible UI toolkit as a blind developer using a screen reader. Would love to hear more :)