Esri / a11y-map

A11y map testing

Assistive technology testing #20

Open geospatialem opened 5 years ago

geospatialem commented 5 years ago

Hi all,

Thank you all for creating such an awesome map and working diligently to provide an accessible map option to the mapping community! :world_map:

Some colleagues at my organization have a few questions about the testing you've been able to conduct so far, including the items below:

1. Has there been testing with text-to-speech software? :loudspeaker:

2. Does the "speech" button that triggers audio (it appears when a user tabs into the app) interfere with screen reader assistive technology? :speaking_head:
[Screenshot: the a11y map's speech button]

3. Do the keyboard navigation buttons (←, ↑, ↓, and →) interfere with screen reader keyboard commands? :keyboard:
[Screenshot: the a11y map with the navigation box activated]

4. Are the popups keyboard accessible once a screen reader is running and the screen reader takes over the keyboard? :bell:
[Screenshot: the a11y map with a popup displayed]

Thank you for your hard work on this project!! :muscle:

Best, Kitty

nwisehart commented 5 years ago

Hi Kitty,

I'll do my best to answer your questions, although my experience with screen readers is limited.

  1. I've tested the app using VoiceOver on Mac, but I haven't had the chance yet to test with JAWS or another screen-reading software.

  2. The speech button shouldn't interfere with a screen reader in any way. It takes priority in the tab index, which a screen reader shouldn't have problems with, and then reveals hidden fields to help navigate the map. These fields should provide more context and direction when using the app and shouldn't interfere with the screen reader. (A sketch of this pattern follows after this list.)

  3. We used the arrows because they seemed intuitive. I'm not sure if they interfere with other screen readers, but they seem to work fine with VoiceOver. We still need to test to make sure they don't override functions assigned by screen readers.

  4. Yes, the pop-ups are screen reader accessible. The pop-up exists as an empty element before being populated with content, so it doesn't break the HTML that the screen reader relies on. Instead, the element is populated with the relevant content, and then focus is moved to it.
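To illustrate point 2, here's a minimal sketch of the pattern described: a button that takes priority in the tab order and reveals hidden helper fields on focus. All element IDs and strings are hypothetical, not the app's actual markup.

```js
// Sketch only: a button first in the tab order that reveals
// visually hidden helper text when it receives keyboard focus.
// IDs and strings are hypothetical.
const speechButton = document.createElement('button');
speechButton.textContent = 'Describe visible map area';
speechButton.tabIndex = 1; // mirrors "takes priority in the tab index"

const helperText = document.createElement('div');
helperText.hidden = true; // hidden until the button is focused
helperText.textContent =
  'Use the arrow buttons to pan the map; press Enter on a feature to open its popup.';

// Revealing the fields on focus gives screen reader users extra
// context before they interact with the map.
speechButton.addEventListener('focus', () => {
  helperText.hidden = false;
});

document.body.prepend(speechButton, helperText);
```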
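And a sketch of the populate-then-focus pop-up behavior from point 4, again with hypothetical names:

```js
// Sketch only: the popup container exists as an empty element in the
// DOM, is filled with content when a feature is opened, and then
// receives focus so the screen reader announces the new content.
function openPopup(popupEl, title, details) {
  popupEl.innerHTML = `<h2>${title}</h2><p>${details}</p>`;

  // Dialog semantics plus a programmatic focus move shift the screen
  // reader's reading position into the popup.
  popupEl.setAttribute('role', 'dialog');
  popupEl.setAttribute('aria-label', title);
  popupEl.tabIndex = -1; // focusable via script, not via Tab
  popupEl.focus();
}
```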

Let me know if you have any other questions, and please let us know how you like using the app and whether you have any other feedback.

Thanks, Nick

geospatialem commented 5 years ago

Thanks @nwisehart! :+1:

I'm checking in with my colleagues in the next few days and will report back by early next week with a follow-up (if any).

Thank you for the quick reply, and nice work on this project!! :muscle:

geospatialem commented 5 years ago

Apologies for the delay in a reply, @nwisehart. Thank you again for your insights and work on this effort! :muscle:

A few items in response to your reply, based on some internal discussions:

1. Text-to-speech: :speaker:

In response to the text-to-speech testing, we're specifically wondering about speech recognition assistive technology (AT) testing, such as Dragon speech recognition, which is used by individuals with low vision and mobility/dexterity impairments. Nuance, the company behind Dragon, has good background material on Dragon's accessibility.

Sorry for the confusion; something was lost in translation in my initial post. Has there been any testing with Dragon speech recognition software or a similar product?

2. Speech button: :speaking_head:

WCAG 2.0 success criterion 1.4.2 (Audio Control) states:

"If any audio on a Web page plays automatically for more than 3 seconds, either a mechanism is available to pause or stop the audio, or a mechanism is available to control audio volume independently from the overall system volume level. (Level A)"

Since the audio is activated by a button, this may satisfy the guideline, but once activated, does the button retain focus so users can deactivate the audio within 3 seconds?
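For illustration, a speech toggle along these lines would keep focus on the button, so a second press stops the audio well within 3 seconds. This sketch uses the browser's SpeechSynthesis API; the app's actual implementation may differ, and `describeMapExtent` is a hypothetical helper.

```js
// Sketch only: an audio toggle compatible with WCAG 1.4.2.
// `describeMapExtent` is a hypothetical helper that returns the
// text to be spoken; the button ID is hypothetical too.
const toggle = document.getElementById('speech-button');

toggle.addEventListener('click', () => {
  if (speechSynthesis.speaking) {
    // Focus is still on the button after activation, so pressing it
    // again cancels the audio immediately.
    speechSynthesis.cancel();
    toggle.setAttribute('aria-pressed', 'false');
  } else {
    speechSynthesis.speak(new SpeechSynthesisUtterance(describeMapExtent()));
    toggle.setAttribute('aria-pressed', 'true');
  }
});
```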

The verbalized information would also need to be made available to all users, including screen reader users. This works with VoiceOver, but we're not sure whether it translates to other screen readers (e.g., JAWS and NVDA) in the same manner.

That said, this functionality is really useful for users with cognitive disabilities, so we really appreciate this feature!

3. Keyboard navigation and screen readers: :keyboard:

We did some testing with JAWS and found that the map navigation arrows conflict with the screen reader when JAWS is open and active on the user's machine: the arrows interfere with JAWS's own keyboard commands. This behavior applies across the Esri JS v4 API and is not limited to this specific application. (We have not conducted testing with NVDA at this time.)

We're not sure what solution could be implemented to avoid the issue, other than possibly adding a dialog that screen readers could read as the web mapping application loads (e.g., an `sr-only` class or similar)?
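Something along these lines is what we had in mind: a visually hidden live region that announces the keyboard behavior as the app loads. The wording and element names are placeholders, not a definitive implementation.

```js
// Sketch only: a visually hidden ("sr-only") live region announced by
// screen readers when the map loads. The styles are the common
// visually-hidden pattern; the announcement text is a placeholder.
const instructions = document.createElement('div');
instructions.setAttribute('aria-live', 'polite');
Object.assign(instructions.style, {
  position: 'absolute',
  width: '1px',
  height: '1px',
  margin: '-1px',
  padding: '0',
  overflow: 'hidden',
  clip: 'rect(0, 0, 0, 0)',
  whiteSpace: 'nowrap',
  border: '0',
});
document.body.prepend(instructions);

// Announce once the app has loaded, so JAWS/NVDA users know the arrow
// keys behave differently while the map has focus.
window.addEventListener('load', () => {
  instructions.textContent =
    'Interactive map. While the map has focus, the arrow keys pan the ' +
    'map instead of performing screen reader commands.';
});
```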

4. Pop-ups: :bell:

Awesome, thank you! Since :focus is moved to the pop-up dialog window once it's opened, we agree it's a great enhancement in the v4 API. Nice work! :+1:

5. Miscellaneous comments: :speech_balloon:

We noticed a great deal of support for visually impaired users (which is great!); however, we're also wondering about support for all other users, such as those with auditory disabilities, motor disabilities, and cognitive disabilities (e.g., memory, problem solving, attention, reading, etc.).

We're also working on providing alternatives for these audiences. There isn't as much documentation out there, but we're using the WCAG 2.0 guidelines to guide us along the way.