department-of-veterans-affairs / va.gov-team

Public resources for building on and in support of VA.gov. Visit the complete Knowledge Hub:
https://depo-platform-documentation.scrollhelp.site/index.html

VADS Components - Define accessibility testing process #84110

Open humancompanion-usds opened 3 months ago

humancompanion-usds commented 3 months ago

Description

Original ticket title: Define & document how we test components for accessibility on the Design System team and on the Governance team

The VA Design System must ensure that the components it offers have been thoroughly tested for accessibility across a wide range of assistive technologies, and it must document that testing so that VFS teams know what the Design System and Governance teams have already covered. The Design System needs a page that describes the level of accessibility testing performed on components: the tech stack the components are tested with, how often they are tested (i.e., the change boundary that triggers a re-test), and a sampling of tests performed or a link to a specific testing plan.
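As one example of the kind of baseline check such documentation could point to, here is a minimal automated scan sketch. The Jest + jest-axe setup is an assumption (not a confirmed part of the VADS toolchain), and the markup is a placeholder rather than an actual VADS component:

```ts
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

// Placeholder markup; a real test would render the component under test
// (assumes Jest's default jsdom environment).
it('has no detectable axe violations', async () => {
  document.body.innerHTML = `
    <button type="button" class="usa-button">Sign in</button>
  `;
  const results = await axe(document.body);
  expect(results).toHaveNoViolations();
});
```

Automated scans like this catch only a fraction of accessibility issues, which is exactly why the manual assistive-technology testing this ticket asks us to define and document still matters.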

@briandeconinck wrote up an excellent overview of our a11y testing and tools. Part of that overview recommended the following:

> I think a good approach would be to test with different levels of rigor depending on what’s being tested: Things that are repeated throughout VA.gov are fully tested with JAWS, NVDA, VoiceOver, and Narrator on desktop, plus VoiceOver and TalkBack on mobile. This includes: Design system components...

I agree. Thus we're going to start with Design System components.

6/5 note: We don't necessarily want to limit this to screen readers.
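To make that scope concrete, here is a rough sketch of the testing matrix expressed as data. This is purely illustrative: the shape and names are assumptions, not an agreed format. The AT/browser pairings come from Brian's overview, and the text-resize row reflects the 6/5 note above.

```ts
// Illustrative only: one way the component testing matrix could be recorded.
type Platform = 'desktop' | 'mobile';

interface TestTarget {
  tech: string; // assistive technology or user setting under test
  platform: Platform;
}

const componentTestMatrix: TestTarget[] = [
  { tech: 'JAWS', platform: 'desktop' },
  { tech: 'NVDA', platform: 'desktop' },
  { tech: 'VoiceOver', platform: 'desktop' },
  { tech: 'Narrator', platform: 'desktop' },
  { tech: 'VoiceOver', platform: 'mobile' },
  { tech: 'TalkBack', platform: 'mobile' },
  { tech: 'Browser/OS text resize', platform: 'desktop' }, // per 6/5 note
];
```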

Details

This is a joint initiative between the Design System and Governance teams. I'd like @briandeconinck and @rsmithadhoc to determine, between the two of them, what the gaps are between the goal above and what we do today. Once that is documented (as an update to Brian's Confluence doc or a comment in this issue), please create a plan for closing the gap. That may include, but is not limited to, a plan to obtain devices or software and a clear delineation of tasks between Brian and Ryan (plus the engineers on the Design System team). To be clear, covering six different tech stacks for every test is likely too much for Brian alone. Thus I'd like Ryan and the Design System team to commit to taking on some of that work, and for both teams to define and document who is doing what and when. This will mean an update to the current Staging Review process for Design System components.

The majority of this initiative is focused on defining the testing we want to be doing for both VA.gov and the mobile application (the former can come first). If, as we define that testing, we can also note where we think our past testing may have been deficient, a remediation plan would be an excellent addition: our next task will likely be re-testing some of our more troublesome components and/or re-testing our most common components on tech stacks we have not previously been able to cover. That will be necessary to make the revised standard true across every component in the system.

Tasks

Acceptance Criteria

shiragoodman commented 3 months ago

Thanks for this, @humancompanion-usds. I have a few questions:

Maybe @caw310, you and I can have a brief discussion to ensure we're on the same page.

caw310 commented 3 months ago

@shiragoodman, for initiatives, I usually create them in the va.gov-team repo, while all other tickets are created in the vets-design-system-documentation repo. You and I can chat about how best to handle a joint, shared initiative.

shiragoodman commented 3 months ago

Initiative has been transferred from the vets-design-system-documentation repo to the va.gov-team repo.

humancompanion-usds commented 3 months ago

> Once that's determined, is there a prioritized list of components DST will be retesting? Or will we rely on previous testing performed by the Governance team at DST Staging Reviews?

If Brian and Ryan feel we should re-test components, then we'd pick those that have the most instances across VA.gov.

> The next steps for DST would be to get the VADS pages updated with the testing details? And the next step for Governance would be updating our DST SR guidance to ensure all future component testing is performed the same way?

Yes and yes!

shiragoodman commented 3 months ago

@humancompanion-usds I hope you don't mind... I changed the title of this epic to shorten it for easy reference with my team. Please feel free to edit, or return to the original title if you'd prefer.

artsymartha68 commented 2 months ago

Love seeing this! Thanks for taking this on.

artsymartha68 commented 2 months ago

I tripped over a couple of font-sizing defects this week because of the way I chose to set these options. I suspect more than a few folks with older eyes like mine will do the same.

It would be good to add these cases to the testing procedures and decide what we're going to cover for VADS components. There are truly so many ways to set font sizes that this is not an easy task, by any means!

1. On my Mac, in Chrome, I went to Settings > Appearance > Font Size and chose Large. This caused some weirdness in the mega menu.
2. On my iPhone 13 mini, I set the phone to use Larger Text via the Accessibility options. As a result, in Safari, the margin on VA.gov body content was zero and the text was smushed up against the edge of the browser window. This effect did not show in emulation, only on-device.
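
One way to catch regressions like these in CI would be a browser test that raises the root font size and asserts the layout holds. Here's a minimal sketch, assuming Playwright; the URL, selector, and assertion are hypothetical, and bumping the root font size only approximates the browser's font-size preference for rem-based styles (case 2 above shows why on-device checks are still needed):

```ts
import { test, expect } from '@playwright/test';

test('body content keeps its margin when text is enlarged', async ({ page }) => {
  await page.goto('https://www.va.gov/'); // hypothetical target page
  // Approximate a "larger text" user preference by raising the root font
  // size. This only affects rem-based sizing; true browser/OS settings
  // need on-device verification.
  await page.evaluate(() => {
    document.documentElement.style.fontSize = '20px';
  });
  const marginLeft = await page
    .locator('main') // hypothetical selector for the body content region
    .evaluate((el) => parseFloat(getComputedStyle(el).marginLeft));
  expect(marginLeft).toBeGreaterThan(0); // hypothetical assertion
});
```
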
humancompanion-usds commented 1 month ago