department-of-veterans-affairs / va.gov-team

Public resources for building on and in support of VA.gov. Visit complete Knowledge Hub:
https://depo-platform-documentation.scrollhelp.site/index.html

Rate your experience feedback mechanism. #65148

Closed ATMiddleton closed 1 year ago

ATMiddleton commented 1 year ago

Issue Description

Create and/or Update Feedback survey designs

Tasks

- Draft: Research Plan for Search & Discovery Team, Resources and Support, Submit Feedback Tool 2.0

Background

Resources and support (R&S) contains tier 2 content (benefit-adjacent content). The Submit Feedback Tool was introduced to R&S article pages last year with the option to rate articles as either good or bad. As a part of the Submit Feedback Tool 2.0 product initiative, once a user rates an article as 'bad', we would like to implement a way for users to report the reason why they rated the page poorly. We would also like these issues to be actionable and easily accessible for the content owners so that improvements can be easily made.

We would like to make improvements to the layout of R&S article pages.

Discovery: Resources and Support Article Pages

Resources and Support Roadmap Session Summary

Learning Center Original Research Folder

Learning Center Moderated Usability Study [Link to product brief coming soon]

OCTO Objectives

Improving Resources and Support will allow Veterans and those in their support system to find information on using their VA.gov account, get supplemental information on benefits, and easily find links to apply for benefits or make changes to their account.

Veteran Journey

Resources and support impacts the Veteran journey from Starting up and all phases thereafter.

Research Goals and Questions

  1. Validate that users are able to easily use the Submit Feedback Tool to rate the article page and report an appropriate problem.
     - Are users able to find where they can report feedback?
     - What do users expect to see after they submit feedback?
     - In what situation would a user rate this page? Why would they rate it 'good', and why would they rate it 'bad'?
     - Does the wording make sense? If not, what should it say?
     - Do the options under "How would you rate your experience" make sense to users?
     - What other types of problems would they expect to report?
     - What do users think about the number of issues presented to them? Is the amount overwhelming?
     - What are their perceptions of the difference between this feedback tool and Medallia?
     - Content creator question: What type of feedback is helpful and actionable to them? How can we make this feedback more accessible to them?
  2. Are users able to easily navigate and discover the rate your experience feedback mechanism on the article pages?
     - Do users utilize the rate your experience feedback mechanism on the article pages?
     - Do users utilize the feedback button near the footer on the article pages?
  3. Is the question "How would you rate your experience" a helpful title that tells users what they will find?
     - What do users think of the title, "How would you rate your experience"?
     - If users do not feel like "How would you rate your experience" reflects what they find in the section, what else would they name it?

Outcome

We will confirm whether:

- Updates allow users to provide feedback, which will inform content editors on changes that can be made to improve R&S articles
- Updates to R&S article pages improve navigation and discoverability of more R&S articles
- The name "Resources and support" accurately informs users of what content they will find

Hypothesis

- Users will not know the purpose of the rate your experience feedback mechanism.
- The good/bad buttons that are currently used to track feedback are not helpful, as the data is recorded in Google Analytics.
- Users will be more likely to use the feedback button near the footer to leave feedback on the article page.

Method

We'll conduct remote usability testing over Zoom using a prototype.

Location

Zoom

Research materials

For moderated usability tests:

[Link to conversation guide](url goes here)
[Link to prototype](url goes here)

Recruitment

Recruitment approach

We will recruit Veteran participants using a lean maximum variation strategy. We'll leverage Perigean's recruiting services to find our participants.

Recruitment criteria

Schedule 10 Veterans for a minimum of 8 successfully completed sessions.

Primary criteria (must-haves)

50% of participants should be able to participate from a desktop device.

50% of participants should be able to participate from a mobile phone.

Secondary criteria (nice-to-haves)

Gender

8 women, 8 men

Inclusion

3 participants who use assistive technology (e.g. screen reader, magnification)
10 participants who identify as other than white
5 participants who identify as other than straight/heterosexual and cisgender
8 participants age 55+
8 participants who identify as having a cognitive disability
5 participants who do not have a college degree
5 participants who live in a rural area

Timeline

If you are using Perigean to recruit, please submit 1 FULL week prior to the start of research for remote research, 2+ weeks for in person.

Prepare

When will the thing you are testing be finalized? (Goes without saying, but should be a few days before testing will begin.)

Please indicate the date and name of a mock participant for a pilot session.

Pilot participant email:
Date and time of pilot session:

Research sessions

What dates do you plan to do research?

Length of sessions

45 minutes. Perigean will schedule the sessions with 15 minutes of buffer time to allow for participants who can't make it on time, or in case you go over time.

Availability

When would you like sessions scheduled? Please list exact dates and times in EASTERN Standard Time. Note: we recommend providing availability outside of work hours, as many Veterans are only available before and after working times, and live across the U.S. Please request enough dates and time slots for the number of requested participants. (e.g. Monday 9-1, 3-6; Tuesday 9-6, etc.)

Team Roles

Moderator: Camille Green, 404-428-2313, Camille.green@oddball.io
Research guide writing and task development: Camille Green
Participant recruiting & screening: Perigean
Project point of contact: Camille Green
Participant(s) for pilot test: TBD
Note-takers: Perigean, Aubrey Archangel, Sarifa Khalilullah
Observers: chante.lantosswett@va.gov, anita.middleton@oddball.io

Acceptance Criteria

UXCamG commented 1 year ago

High-fidelity mockups and user flow: https://sketch.com/s/dbc369e6-21ae-4717-a169-a5e734934f93

UXCamG commented 1 year ago

Research Plan for Veterans Support Team, Resources and Support, Rate your experience feedback mechanism.

Background

Resources and support (R&S) contains tier 2 content (benefit-adjacent content). The Rate your experience feedback mechanism was introduced to R&S article pages with the option to rate articles as either good or bad. As a part of the Rate your experience product initiative, once a user rates an article as 'bad', we would like to implement a way for users to report the reason why they rated the page poorly. We would also like these issues to be actionable and easily accessible for the content owners so that improvements can be easily made.

We would like to make improvements to the layout of R&S article pages.

See product brief

OCTO Objectives

Improving Resources and Support will allow Veterans and those in their support system to find information on using their VA.gov account, get supplemental information on benefits, and easily find links to apply for benefits or make changes to their account.

Veteran Journey

Resources and support impacts the Veteran journey from Starting up and all phases thereafter.

Research Goals and Questions

1. Validate that users are able to easily use the Rate Feedback mechanism to provide feedback on the article pages and report an appropriate problem.

2. Are users able to easily navigate and discover the rate your experience feedback mechanism on the article pages?

3. Is the question "How would you rate your experience" a helpful title that tells users what they will find?

Outcome

We will confirm whether:

Hypothesis

Method

We'll conduct remote usability testing over Zoom using a prototype.

Location

Zoom

Research materials

For moderated usability tests:

Recruitment

Recruitment approach

We will recruit Veteran participants using a lean maximum variation strategy. We'll leverage Perigean's recruiting services to find our participants.

Recruitment criteria

Schedule 10 Veterans for a minimum of 8 successfully completed sessions.

Primary criteria (must-haves)

Secondary criteria (nice-to-haves)

Gender

Inclusion

Timeline

Completion of 8 successful research sessions between October 16 - October 23.

Prepare

• Research team will organize pilot.

Research Sessions

• Planned dates of research: October 16 - October 23
• We would like to request that Perigean calls each participant to remind them about the session, in addition to emailing them. Please include the session time in each participant's own time zone (from a screener question).

Length of Sessions

• Session Length:
o 45 minutes for non-assistive technology users
o 60 minutes for assistive technology users
• Buffer time: 30 minutes between sessions.
• Maximum sessions per day: 3

Availability:

• 10/16: 12pm - 5pm ET
• 10/17: 12pm - 5pm ET
• 10/18: 12pm - 5pm ET
• 10/19: 12pm - 5pm ET
• 10/20: 12pm - 5pm ET
• 10/23: 12pm - 5pm ET

Team Roles

UXCamG commented 1 year ago

Brainstorming session was captured here: https://jamboard.google.com/d/1dXlRa-x8lCjMzGFwQyNCU83_3UotWQ2HAzPqz_AhZXU/edit?usp=sharing

UXCamG commented 1 year ago

Conversation Guide

Day of the session

Starting the session

Intro - 5 minutes

Thanks for joining us today! My name is Camille Green and Perigean is taking notes. I also have a couple other colleagues in the Zoom waiting room who would like to observe and take notes. But before we get to them...

Today we're going to talk about Medallia's feedback survey tool, which we are interested in using to allow Veterans to rate multiple pages on Resources & Support.

Before we start, I have a few things that I want to go over with you:

Start recording.

Warm-up Questions - 5 minutes

Before we get started with the discussion, let’s start with a few general questions so I can learn a little bit about you.

  1. Tell me a little bit about yourself. What do you do in your spare time?
  2. How long have you served in the military? What is your branch?
  3. What do you typically do day-to-day in your current role?

Excellent. Thanks so much for your answers.

Scenario

Imagine you are searching for Resources and Support because you are unable to sign in to your VA.gov account. Before you leave the resources and support page, you want to ensure that you leave feedback to rate your experience, so you expand the rate tool to review the resources and support page in detail. Once you have finished leaving feedback for the resources and support page, you want to return to VA.gov. Today we’ll test a concept for rating your experience when searching for resources and support on VA.gov.

Tasks

  1. Please read all tasks out loud and remember to share your thoughts (e.g., likes/dislikes, feelings, expectations) as you perform each task. To begin, click Next.
  2. [Verbal Response] First, please answer a few brief questions about your role: What is your job title? What roles are you currently hiring candidates for? When finished, click Next.
  3. Now we’ll visit the page we will be testing. Once the page loads, click Next.

Prototype

What → VA.gov + Resources and Support - 30 minutes

o Start on the Home page (VA.gov)
o Click on the other search tools.
o Open find benefit resources and support.

Let’s imagine you are searching for information on how to Sign-in to VA.gov for veterans. You see a list of topics on the home page. When done, click next.

  1. [Verbal Response] You want to search for a tool that will help you sign in. Scroll down to the area on the left of the home page where it says "Find benefits resources and support," then click it. You should now see Resources and support on the screen. When done, click next.
  2. [Verbal Response] Without clicking/typing anything, please describe how you would find information on how to sign in to VA.gov. When done, click next.
  3. [Verbal Response] I want to direct your attention to the “Browse by topic” title at the top left of the screen. What do you expect to happen when you click the “Signing in to VA.gov” button? When done, click “Signing in to VA.gov”, then click next.

Things to watch for:

What → Resources and Support + Signing in to VA.gov

o Start with browse topics.
o Click on Signing in to VA.gov.
o Scroll down the Signing in to VA.gov page.
o Click on Rate your experience.
o Select feedback options.

  1. [Verbal Response] You have decided that you are interested in learning more about signing in to VA.gov, so you click on the Signing in to VA.gov link (underlined in blue) on the resources and support page. When done, click next.
  2. [Verbal Response] You want to rate the sign-in to VA.gov page. Without clicking/typing anything, please describe where you would go on the screen to rate this page. When done, click next.
  3. [Verbal Response] I want to direct your attention to the “Rate your experience” feedback mechanism at the bottom of this page. Without clicking/typing anything, please describe what you think will happen if you click the “good” or “bad” button. When done, click next.
  4. [Verbal Response] In what instances would you use the “Good” button? Why? When done, click next.
  5. [Verbal Response] In what instances would you use the “Bad” button? Why? When done, click next.
  6. [Verbal Response] Imagine you did not find the sign-in information you were looking for. Click the “Bad” button. Please describe what happened on this screen after you clicked “Bad”. What does the form look like to you? When done, click next.

Things to watch for:

What → Bad Response / Ideal Responses / Enhancements / Hesitation for the feedback mechanism

o Overview/thoughts on Rate your experience.
o Overview/thoughts on updated Medallia form.
o Any questions for me?

  1. [Verbal Response] Imagine you want to give feedback on the content of the page. Without clicking/typing anything, please describe if the responses on the form reflect the information you would like to share. What information would you want to add or remove from the form and why? When done, click next.
  2. [Verbal Response] To return to the results page, click the “x” on the top right side of the form. When done, click next.

Things to watch for:

Consent to use video clips

Thank-You and Closing - 5 minutes

Well we really appreciate you taking the time to share your thoughts with us today. Your feedback is so helpful to us as we continue to work on the site and make sure it really works for Veterans.

Thanks! Lastly, Perigean will be sending you a thank-you note with a little blurb that you can pass along to other Veterans you may know, to give them the chance to participate in future research studies.

Thank you so much again, and enjoy the rest of your day!

UXCamG commented 1 year ago

Rate tool feedback mechanism (Medallia for Articles) product outline

Overview

The Veterans support team would like to suggest an expanded version of the feedback tool (Good/Bad Rating), and we want to use Medallia to capture the data. Medallia will offer a combined experience for Veterans as well as streamline operations internally. Additionally, Medallia already has analytics capabilities that the Veterans support team can build off of. The expanded version of the rating tool that we are interested in utilizing will allow users to provide details for their ratings on multiple article pages on R&S.

Problem

Currently, the Resources and support article pages give Veterans the option to submit whether their experience on the page was good or bad. The Veterans support team has been discussing additional messages for selection so we can further analyze the reason for a rating (i.e. why was it good or bad). When we discussed adding questions about the article experience, we identified a couple of options to explore in regard to styling. Our options require expanding the rating experience toward if-this-then-that decision trees, etc.
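As a rough sketch of the if-this-then-that idea above, the snippet below shows one way the expanded flow could branch: a "Good" rating submits immediately, while a "Bad" rating reveals a short list of reasons before submitting. This is only an illustration; the reason labels, element IDs, and logging call are hypothetical and are not the VA.gov or Medallia implementation.

```typescript
// Hypothetical sketch of an expanded "Rate your experience" flow.
// Reason labels, element IDs, and the logging below are illustrative only.

type Rating = 'good' | 'bad';

const BAD_REASONS = [
  "I couldn't find the information I was looking for",
  'The information was unclear or confusing',
  'The page had a technical problem',
  'Other',
];

function renderRateYourExperience(container: HTMLElement, articleUrl: string): void {
  container.innerHTML = `
    <h2>How would you rate your experience on this page?</h2>
    <button id="rate-good">Good</button>
    <button id="rate-bad">Bad</button>
    <fieldset id="bad-reasons" hidden>
      <legend>What went wrong?</legend>
      ${BAD_REASONS.map(
        (r, i) => `<label><input type="checkbox" value="${i}"> ${r}</label><br>`
      ).join('')}
      <button id="submit-bad">Submit feedback</button>
    </fieldset>
    <p id="thank-you" hidden>Thank you for your feedback.</p>
  `;

  const reasons = container.querySelector<HTMLFieldSetElement>('#bad-reasons')!;
  const thankYou = container.querySelector<HTMLElement>('#thank-you')!;

  const finish = (rating: Rating, selected: string[]) => {
    // In a real build this would be sent to whatever backend or survey
    // tool (e.g. Medallia) ends up owning the data.
    console.log('feedback submitted', { articleUrl, rating, selected });
    reasons.hidden = true;
    thankYou.hidden = false;
  };

  // A "Good" rating submits right away.
  container.querySelector('#rate-good')!.addEventListener('click', () => finish('good', []));

  // "If this then that": a "Bad" rating branches into a follow-up question.
  container.querySelector('#rate-bad')!.addEventListener('click', () => {
    reasons.hidden = false;
  });

  container.querySelector('#submit-bad')!.addEventListener('click', () => {
    const selected = Array.from(
      reasons.querySelectorAll<HTMLInputElement>('input:checked')
    ).map((input) => BAD_REASONS[Number(input.value)]);
    finish('bad', selected);
  });
}
```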

Desired User Outcomes

- Users will have a consistent experience when providing feedback
- The rating tool will be located in a consistent part of the page, and users will know where to look on a page in order to provide feedback
- Solving the issue of providing detailed feedback could provide less frustration, more efficiency, the ability to submit issues, and an overall improved experience for Veterans

Undesired User Outcomes

- Creating a frustrating experience for Veterans
- Creating a more time-consuming experience for Veterans
- Veterans may be confused why they are asked different questions on different pages of the website (note: research to validate)
- Inconsistent UI for submitting feedback
- Users feel interrupted and/or frustrated by attempts to collect feedback

Desired Business Outcomes

- Streamlined, sensible process for Veterans to submit feedback
- Product/content owners are able to quickly and easily access user feedback
- Product/content owners are able to understand user feedback in order to make improvements
- Utilization of Medallia, since it is already implemented and provides analytics

Undesired Business Outcomes

- Additional similar processes in different groups
- Additional reporting tool creations
- User feedback is hard to access and/or understand
- The type of feedback collected is not helpful to the product/content owner

Measuring Success

Utilization of the solution implemented will determine the success of using Medallia vs. the existing good/bad feedback tool.
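Comparing utilization would require comparable event counts from the existing good/bad buttons and from Medallia responses. As a minimal sketch, assuming the page loads Google Tag Manager's standard `window.dataLayer` (the event name and fields below are hypothetical, not VA.gov's actual analytics schema), a rating submission could be counted like this:

```typescript
// Hypothetical utilization event, pushed via Google Tag Manager's dataLayer.
// Event name and fields are illustrative; the real analytics schema may differ.

declare global {
  interface Window {
    dataLayer?: Record<string, unknown>[];
  }
}

export function recordRateYourExperienceEvent(
  rating: 'good' | 'bad',
  articlePath: string,
  reasons: string[] = []
): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'rate-your-experience-submit', // hypothetical event name
    rating,
    articlePath,
    reasonCount: reasons.length,
  });
}

// Example (hypothetical path):
// recordRateYourExperienceEvent('bad', '/resources/signing-in-to-vagov/', ['Unclear content']);
```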

Objectives and Key results (OKRs)

Empower VFS teams to use customer feedback to improve VA.gov digital products and tools.


Discovery

Assumptions

- Medallia can be enabled/disabled on certain pages
- Veterans support could have a separate set of questions or add questions to Medallia and have those presented on the appropriate article pages
- Medallia's Intercept Tool allows users to submit feedback only once and will not reappear until 90 days later. We will not be utilizing this tool.

- Create one combined feedback mechanism for Veterans
- This makes more sense than having both good/bad and Medallia, because it is confusing to Veterans as to why there are two surveys, and it is also inefficient from a development, maintenance, and KPIs perspective
- We have decided to limit the additional feedback responses so that it is quick for Veterans.
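If the first two assumptions above hold, per-page control could be expressed as a simple mapping from article paths to question sets. The sketch below is only an illustration of that assumption; the paths and question text are made up, and this is not Medallia's actual configuration API.

```typescript
// Hypothetical per-page configuration: which R&S article paths get the
// expanded survey, and which follow-up questions they show on a "bad" rating.

interface ArticleSurveyConfig {
  enabled: boolean;
  badRatingQuestions: string[];
}

const SURVEY_CONFIG: Record<string, ArticleSurveyConfig> = {
  // Illustrative path; real article paths would be supplied by the team.
  '/resources/signing-in-to-vagov/': {
    enabled: true,
    badRatingQuestions: [
      "I couldn't sign in",
      "The instructions didn't match what I saw",
    ],
  },
  // Pages not listed fall back to the default below.
};

const DEFAULT_CONFIG: ArticleSurveyConfig = {
  enabled: true,
  badRatingQuestions: ["I couldn't find what I was looking for", 'Other'],
};

export function getSurveyConfig(articlePath: string): ArticleSurveyConfig {
  return SURVEY_CONFIG[articlePath] ?? DEFAULT_CONFIG;
}
```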

Initiatives

Neither an initiative brief nor a previous product outline exists for the good/bad component, as it was never intended to be an enduring component; it was a stopgap build based on the fact that Medallia was rolling on to VA.gov, and once that was completed our intention was to revert back to a Medallia-like CSAT capture.

Go-to-market

Currently, the good/bad rating component is available on article pages and the Medallia survey is on the feedback button near the footer of the site. Good/Bad is just that, with no supporting information. Medallia gives VA.gov the ability to streamline the survey results into one experience, and it should be marketed as such.

TBD - This will need to be decided with Chante Lantos as well as Anita Middleton and team to ensure a seamless rollout that enhances the user experience. Additionally, when rolled out, we will want to make sure we are very clear and concise to the Veteran about what they are responding to (i.e. an article, a page, etc.).

Launch Planning

Collaboration Cycle

- Kickoff ticket

Timeline

Describe any major milestones for this initiative including organizational, legislative, etc. constraints.

Initiative Launch Dates


Screenshots

Before

After


Communications

Where will you discuss this initiative? ZenHub

Stakeholders

What offices/departments are critical to make this initiative successful?

- Office/Department:
- Contact(s):

UXCamG commented 1 year ago

This thread has drafts of the documents required for the collaboration cycle, design intent, and research. At this time, we are waiting on feedback from the collaboration cycle and Research ops teams.

ATMiddleton commented 1 year ago

Combined with https://app.zenhub.com/workspaces/contact-center-62cdd9546ec1530018209672/issues/gh/department-of-veterans-affairs/va.gov-team/66100

Collab cycle broken into multiple issues.