Here's a list of changes that were made to a heuristic evaluation plan for hhs.gov:
Expert evaluators - For somebody to go through this process as an evaluator, we would need to assume good knowledge of usability practices. Philip recommended speaking with a group of UX designers (from Bixal and/or Webfirst) who don't have a ton of familiarity with the website.
Heuristics list - Establish a common set of heuristics for everyone to refer to and measure the site against, instead of having evaluators create their own. Emilia created a hybrid list of heuristics from NN and the USDS playbook.
Scope of evaluation - We don't know specific user tasks yet (why people are coming to hhs.gov and what they are doing), so we shouldn't ask them to do super specific things. Philip recommended identifying key pages or site sections you'd like people to explore: navigate them to those sections and ask them to measure what they see against pre-determined heuristic benchmarks.
@e-nardi, we can talk more about this, but here's what I'd like to do:
Looks like there was some HE work done on SSA toward the beginning that we can reference as well.
How's this one coming along, @allie-shaw?
Ready for feedback from @Bixal/methods
Hey @allie-shaw, nice work! It's a very clean and easy read.
I have a philosophical question for the @Bixal/methods group, which might apply to other methods as we are tweaking them, and maybe it's a broader discussion for a later time, but I am capturing it here so I don't forget:
I took the approach of a lighter edit of what exists today to get this out the door. Main changes:
@e-nardi and I talked about some shortcomings with the current instructions:
At the conclusion of Emilia's current activity, we will revisit and update the instructions to address these issues and whatever else we learn from going through it.