I attended the Mobile/SEO session; below is the majority of the discussion:
Current way OCIO captures Mobile access:
What should they be scanning for Mobile/SEO experience? Possible indicator categories:
Notes from the Webpage Feedback session today:
• Replacing "Customer Feedback" with "Webpage Feedback"
• Webpage Feedback Indicators Intro:
○ Example: "Is this page useful?" with Yes/No links, plus a link to report a problem.
○ Seeks to understand webpage-level accessibility, content, and security issues.
○ Standardized government-wide.
○ This info will help agencies prioritize top websites and justify the need for resources.
○ Does not replace the A-11 CX post-transaction customer survey requirements
• Business Questions:
○ Use case for oversight
§ "Which websites from X agency have received the most positive/negative feedback in the past week?"
○ Use case for implementation:
§ "What mix of positive/negative feedback has my website received in the past week?"
§ "Which URLs received this feedback?"
§ "How many thumbs up/thumbs down?" (see the aggregation sketch after these notes)
• What would be appropriate indicators for website content (usefulness, user satisfaction, etc.) collected at the webpage level?
○ Tricky to get feedback from humans, but human input is very important.
○ Need to make sure questions are customized and don't conflict with or duplicate the A-11 customer feedback requirements; they should be more web-specific.
○ Clarity between website feedback and webpage feedback - these are different use cases.
○ Findability of content - ensure users can find what they're looking for - good search functionality.
○ Online data analysis tools - "did you accomplish your task?" general feedback
○ Clarity of the content, use of plain language, avoidance of government-speak
○ Dealing with outliers - how?
○ International vs. domestic audiences - extreme distrust in government and negative feedback - does a one-size-fits-all approach to data collection even make sense?
○ Google Journey functionality - gov-wide tracking between .gov sites (Ex: SSA-Medicare-unemployment)
○ Incorporate this tool into usability testing - make sure it's in the right place and users are seeing/using it
○ Track fed employee experience as well
○ Making sure open-ended question options are available
○ Multilingual support within the survey
• How are agencies currently collecting this type of feedback?
○ High-impact services follow the new customer experience survey requirements (A-11), which are very clear - using this model for other pages as well
§ Weird issue of having HISP and webpage feedback surveys stacked on the same page - could be confusing
○ Farmers.gov - blue feedback button - 3 broad categories:
§ tech issue - routes to IT contacts
§ content on site
§ general federal comments not specifically related to site - routes to main USDA website
○ General survey with 1-5 star rating and one question
○ SSA - "Was the page helpful?" (y/n), "Why?" (multiple choice), and "How can we improve?" Helpfulness rating for every page.
○ CDC - opt-in qualitative survey - audience, usefulness, trustworthiness. If ratings are low, it asks why.
○ Census - important to be flexible enough to spot trends, e.g., for combating disinformation. Large educational component to their feedback. Using AI to help process feedback and combat disinformation. Need time and place context to avoid erroneous interpretations.
○ Ag - open-ended questions led to significant changes
○ Feedback does not replace user research
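The business questions above reduce to a simple aggregation over feedback events. As a minimal sketch (not any agency's actual implementation), assuming a hypothetical event record with url, agency, thumbs_up, and timestamp fields that the notes don't specify, the weekly positive/negative mix per URL could be computed like this:

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

# Hypothetical feedback events; field names and values are assumptions
# for illustration only.
feedback_events = [
    {"url": "https://www.ssa.gov/benefits", "agency": "SSA",
     "thumbs_up": False, "ts": datetime(2024, 5, 7, tzinfo=timezone.utc)},
    {"url": "https://www.ssa.gov/benefits", "agency": "SSA",
     "thumbs_up": True, "ts": datetime(2024, 5, 6, tzinfo=timezone.utc)},
    {"url": "https://www.farmers.gov/loans", "agency": "USDA",
     "thumbs_up": True, "ts": datetime(2024, 5, 6, tzinfo=timezone.utc)},
]

def weekly_feedback_mix(events, agency, now):
    """Count thumbs up/down per URL for one agency over the past week."""
    cutoff = now - timedelta(days=7)
    counts = {}  # url -> Counter({"up": n, "down": n})
    for e in events:
        if e["agency"] == agency and e["ts"] >= cutoff:
            counter = counts.setdefault(e["url"], Counter())
            counter["up" if e["thumbs_up"] else "down"] += 1
    return counts

# Implementation view: one agency's weekly mix, URLs with the most
# negative feedback first.
mix = weekly_feedback_mix(feedback_events, "SSA",
                          now=datetime(2024, 5, 8, tzinfo=timezone.utc))
for url, c in sorted(mix.items(), key=lambda kv: -kv[1]["down"]):
    print(url, dict(c))
```

The same counts serve both use cases: filter by agency for a site owner's view, or group across agencies and rank by negative feedback for the oversight view.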
@eelzi-ONRR @lpgoldstein I took notes on the first accessibility session, but I'm not sure the session overall was useful. Let me know if I should post them anywhere on SharePoint.