phetsims / a11y-research

a repository to track PhET's research into accessibility, or "a11y" for short
MIT License

Braille reader output during custom interactions #163

Open emily-phet opened 3 years ago

emily-phet commented 3 years ago

In a recent interview, a participant was using a Braille reader, and was unable to get Braille output of the descriptions when using the Both Hands mode.

@terracoda and I thought this was unexpected. Can @zepumph and/or @jessegreenberg confirm whether this is to be expected, or whether this is an implementation issue?

zepumph commented 3 years ago

I think that this will be challenging for me to tackle without having a braille reader, or at least understanding a bit more about them.

The context and object descriptions of the both hands interaction run completely on aria-live. Was the user able to get any context responses like when adjusting the controls on the left? Is there a recorded clip where I could gain some context about how a braille reader works?
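For context, the delivery looks roughly like this (a minimal sketch for illustration, not the actual sim code; the announce helper and the response string are made up):

```ts
// Minimal sketch of aria-live based description delivery (illustration only,
// not the actual sim code). Screen readers watch the live region and announce
// any text content inserted into it.
const liveRegion = document.createElement( 'div' );
liveRegion.setAttribute( 'aria-live', 'polite' ); // announced when the user is idle
document.body.appendChild( liveRegion );

// Hypothetical helper: each context/object response is delivered as a text change.
function announce( response: string ): void {
  liveRegion.textContent = response;
}

// e.g. a context response while adjusting the control on the left:
announce( 'Left hand moved up. Hands getting closer to the target ratio.' );
```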

emily-phet commented 3 years ago

My question was first just verifying whether or not Braille reader output should be expected for custom interactions. I wasn't sure if that was the case.

It didn't come up until later in the interview that the participant had not been able to access Braille output consistently, as he was also hearing the descriptions aloud. I'm not sure how deeply the participant and @terracoda delved into what was providing Braille output and what wasn't. @terracoda can comment on what she interpreted as providing Braille output and what was not.

terracoda commented 3 years ago

I'll review the video. I think there was speech output but no braille output, which does seem odd to me. I would assume that the output channels would be identical. My understanding of a braille display is that it displays the same content the screen reader is speaking, but that assumption could be very wrong, since one can be used without the other. The two pieces of equipment are not inherently doing the same thing.

On the technical code side, I am wondering if it is the same case for BASE's balloon and Friction's Chemistry book. If so, it's just something we haven't noticed before due to circumstance.

The two-step drag interaction (BASE and Friction) and the immediate drag interaction (RaP) launch their aria-live updates differently, so I am wondering if that may have something to do with it.
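To illustrate what I mean by different launching, a rough sketch (hypothetical element ids and strings, not the actual sim code):

```ts
// Rough sketch of the two patterns (hypothetical names, not the actual sim code).
declare const liveRegion: HTMLElement; // an element with aria-live set, as above

// Two-step drag (BASE, Friction): the item is grabbed, moved, then released,
// so live-region updates tend to fire on those discrete events.
document.getElementById( 'balloon' )?.addEventListener( 'keyup', () => {
  liveRegion.textContent = 'Balloon grabbed.'; // one update per event
} );

// Immediate drag (RaP): arrow keys move the hands directly, so the live region
// can be updated on every movement step, which is far more frequent.
document.addEventListener( 'keydown', event => {
  if ( event.key === 'ArrowUp' ) {
    liveRegion.textContent = 'Left hand moved up.'; // update per keystroke
  }
} );
```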

Regardless, this is an interesting problem.

terracoda commented 3 years ago

According to https://www.boia.org/blog/how-braille-displays-help-users-with-visual-impairments-access-the-web, a braille display should be able to display text-based content, so in theory the device should be able to display text delivered via a live region just like a screen reader does.

Hmm, according to this MDN resource, there may be a need for two channels: one for screen reader output and one for braille output. I am not sure how this works, actually.

https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/ARIA_Screen_Reader_Implementors_Guide

@zepumph and @jessegreenberg, I am curious how our utterance queue works in comparison to the list described on the MDN page (linked above).

Does our utterance queue have those kinds of options built in?

I am wondering whether improvements in browser support for the complex aria attributes associated with live regions are falling into place. I am curious if/how the utterance queue is taking advantage of this new support, if in fact the support is really good.

zepumph commented 3 years ago

@jessegreenberg and I reviewed this issue during our meeting today. We don't really understand why the braille output wasn't occurring; the documentation over in https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/ARIA_Screen_Reader_Implementors_Guide made us think that it was perhaps a browser-specific issue:

Optionally, create a second, additional queue if the user configures a second hardware channel: If there are two channels for presentation (e.g. text to speech and a Braille display), then two queues can be maintained to allow for parallel presentation. The channels could be user configured for presenting live regions based on role or politeness.

Since this is optional for browsers to implement, perhaps some of the time the braille device isn't set up to parallel the description coming from aria-live via the screen reader.

Reiterating @terracoda's question from above:

@zepumph and @jessegreenberg, I am curious how our utterance queue works in comparison to the list described on the MDN page (linked above).

Our "UtteranceQueue" has nothing to do with the "queueing" described in the above link. That MDN article seems to be describing the desired implementation of aria-compliant implementors (browsers) and not web page creators.

@terracoda what device/browser was the user on? Was it their go-to device/browser or one that we recommended them to use?

@jessegreenberg and I feel like it may be hard to make progress on this without having a braille device to prototype with and reproduce the problem.

terracoda commented 3 years ago

The participant was using NVDA and FF on W10, their go-to SR and browser, and a Braille display. I did not note down the brand name of the Braille display.

terracoda commented 3 years ago

I also don't understand why the text is not coming through to the Braille Display.

zepumph commented 3 years ago

@emily-phet can you please recommend how to proceed with this? Should QA get a braille reader to test with?

emily-phet commented 3 years ago

I think we should just do some light investigation into this for RAP, to determine if this is a general problem for Braille readers or for a subset of scenarios with Braille readers. For a start, we could reach out to our consultants that use screen reader devices and ask if they happen to ever use a Braille device as well. We could also reach out to the CU Usability and Accessibility Lab and see if they have access to a device and potentially someone in their lab who would spend some time with us to try out a few things.

If this feels like too much for right now - it will need to involve @terracoda to reach out to folks - we could do this in 2021.

terracoda commented 3 years ago

I did some light investigation by posting a few questions on the WAI-IG list. Apologies for not getting comments into this issue sooner.

It seems like this issue may be screen-reader dependent, and we might want to submit a bug to NVDA.

Here are the questions I asked about screen reader software (SR), braille displays (BD), and ARIA live regions.

  1. Is it safe to assume that text is text, and that screen reader software (SR) and braille display devices (BD) handle or display text-based content in the same way, i.e. through the same channels simultaneously?

  2. In theory, is text marked up in an element properly designated as an aria live-region in some way different from text marked up in a paragraph tag? I mean really different once the text is recognized by the SR or the BD.

  3. Since users often use an SR and a BD together, are there special situations that web developers should be aware of to ensure text is accessible and deliverable by both technologies simultaneously, especially in the case of custom interactions where aria live-regions are being employed?

Here are snippets of the responses I got:

Braille Displays are controlled by screen reading software – they are not separate entities in one sense. Having said that, screen readers may treat audio, and Braille messages etc. differently. For example, in Braille a message may appear, and is displayed for five seconds, then disappears! (David B. Andrews)

Well, the good news for you is that NVDA, JAWS and VoiceOver (Mac) all have a Braille viewing feature. For the latest JAWS and VoiceOver versions, I believe you can see the Braille dots and a translation of those; I am not sure about NVDA. So, you can see the actual behaviour. Depending on the content type, most screen readers will update content dynamically on the braille display where it would usually be considered dynamic. So if the Braille cursor is on a slider, and that slider changes value, the Braille display will change fairly dynamically as its value changes. Note, I mentioned a Braille cursor for JAWS and NVDA. What is in the Braille display can be quite different from what audio is being produced. However, as mentioned by others, Braille displays might show flash messages; these are messages that will appear on the display for a couple of seconds (configurable by the user). I believe live regions are categorized as things to show in flash messages. (Jonathan Cohn)

In my experience there are limitations in how aria-live content is announced by some screen readers. People who are deafblind would rely on braille output from screen readers. There are speech modes on the braille display allowing users to display what was announced in speech, but these modes may be difficult to enable and switch back and forth. I would reach out to screen reader vendors to find out what can be done to change how they work with aria-live regions. (Jonathan Avila)

I provided clarifications about our context in between these nice responses, which I have not posted here.

But it seems we may want to first reach out to NVDA about how they work with live-regions.

terracoda commented 3 years ago

Latest helpful information from w3c-wai-ig email list (Sean Murphy):

On this topic the general rule with two of the four commonly used screen readers:

  • VoiceOver on Mac or iOS using Braille shows the same information being spoken.
  • JAWS does have different levels (Structured, Line, and screen reader announced) – I have not tested JAWS with a Braille display with live regions. Something I will do in the near future.
  • TalkBack does not support braille; you have to install the BrailleBack app. How it handles live regions I do not know.
  • NVDA: need to investigate.
  • Narrator: the same.

zepumph commented 3 years ago

Thanks for the updates @terracoda. It is so awesome that you reached out and got so many responses back. I really appreciate it.

I want to reiterate https://github.com/phetsims/ratio-and-proportion/issues/233#issuecomment-725026109 here:

The participant was using NVDA and FF on W10.

So from all the above information, we seem to know the least about NVDA.

Next steps from here:

@emily-phet does this sound like an appropriate path forward? Please let me know your thoughts, especially about if this should hold up RaP.

emily-phet commented 3 years ago

@zepumph This approach sounds great to me!

I reached out to the CU Accessibility and Usability Lab to see if they had a Braille Reader device, and if they had anyone on their team with knowledge of Braille Readers who might work with us to understand more. I'll update if/when I hear back.

They're very friendly folks so I assume I'll hear something.

emily-phet commented 3 years ago

I just heard back from the CU Accessibility and Usability Lab. They have someone who does testing with them who uses a Braille Reader, but it has recently broken. They're getting it repaired, and they will put us in contact with that person once that's done.