GSA-TTS / federal-website-standards

Federal website standards
https://standards.digital.gov/

# Search #155

Open · michelle-rago opened 2 months ago

michelle-rago commented 2 months ago
### Tasks
- [x] Review existing research
- [x] Conduct new research if needed
- [x] [Draft standard in Google docs for internal sharing](https://docs.google.com/document/d/1mdRTyrlPZoCsjUfPOrvlT8e-eSd657fX5SpFabYj_7Q/edit?usp=drive_link)
- [x] Share draft with stakeholders
- [ ] Research legal viability of exceptions for applications and tools that won't benefit from a site search
- [ ] Revise as needed
- [ ] Conduct usability testing on the standard
- [ ] Revise as needed
- [ ] Get internal and external approvals to move to pending phase

michelle-rago commented 3 weeks ago

Consider sites that provide search over a specific set of content (e.g., https://data.census.gov/, https://login.gov/help/), sites that focus on completing a process (e.g., requesting a .gov domain), and lookup tools (e.g., https://889.smartpay.gsa.gov/), where adding a site-wide search box would provide no benefit to users.

michelle-rago commented 3 weeks ago

Search guidance from OMB Memo M-23-22:

> - Use on-site search functionality: Agencies’ public-facing websites must contain a search function that allows users to easily search content intended for public use. This search function should be a site-wide global search and, when appropriate, could be a feature-specific search for a subset of the website content that is of significant public interest (e.g., find-a-form tool). Agencies should participate in the Search.gov program by utilizing Search.gov for on-site search solutions or by integrating search solutions with Search.gov.
>
> - Design search-engine optimized content: Agencies should ensure that publicly available content (i.e., content that does not require user authentication or sign in) is designed and structured so it can be effectively crawled and indexed by search engines. Agencies must not limit which search engines or crawlers can access or archive their public content. Agencies should employ best practices to improve crawling or indexing of web content, including using sitemaps, robots.txt files, and descriptive metadata in commonly parsed fields (e.g., meta element tags).
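
The first bullet of the quoted memo points agencies to Search.gov for on-site search. As a rough illustration only, the sketch below queries the Search.gov Results API; the endpoint, parameter names, and response shape follow Search.gov's public results API documentation as best understood here, and the affiliate handle and access key are hypothetical placeholders. Verify everything against https://open.gsa.gov/api/searchgov-results/ before relying on it.

```python
"""Minimal sketch: querying the Search.gov Results API.

The endpoint and parameter names are taken from Search.gov's public
results API docs as best understood here; the affiliate handle and
access key below are hypothetical placeholders.
"""
import json
import urllib.parse
import urllib.request

SEARCHGOV_RESULTS_URL = "https://api.gsa.gov/technology/searchgov/v2/results/i14y"


def site_search(query: str, affiliate: str, access_key: str) -> dict:
    """Return raw JSON search results for one affiliate (site)."""
    params = urllib.parse.urlencode({
        "affiliate": affiliate,    # hypothetical: your site's Search.gov handle
        "access_key": access_key,  # hypothetical: key issued by Search.gov
        "query": query,
    })
    with urllib.request.urlopen(f"{SEARCHGOV_RESULTS_URL}?{params}") as resp:
        return json.load(resp)


if __name__ == "__main__":
    results = site_search("passport renewal", "example-agency", "YOUR_ACCESS_KEY")
    # Response shape assumed from the API docs; adjust to the actual payload.
    for hit in results.get("web", {}).get("results", []):
        print(hit.get("title"), hit.get("url"))
```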
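The second bullet names sitemaps and robots.txt files as crawlability best practices. Below is a minimal sketch, using only the Python standard library, that checks whether a site publishes a robots.txt and whether any `Sitemap:` directives in it resolve; the usa.gov URL in the usage line is just an example target.

```python
"""Sketch: quick crawlability check for a public site, per the memo's
best practices (robots.txt and sitemaps). Standard library only."""
import urllib.parse
import urllib.request


def check_crawlability(site: str) -> None:
    """Fetch robots.txt and report any declared sitemaps and their status."""
    robots_url = urllib.parse.urljoin(site, "/robots.txt")
    try:
        with urllib.request.urlopen(robots_url) as resp:
            robots = resp.read().decode("utf-8", errors="replace")
        print(f"robots.txt found at {robots_url}")
    except OSError as err:
        print(f"no robots.txt at {robots_url}: {err}")
        return

    # Sitemap directives are lines of the form "Sitemap: <absolute URL>".
    sitemaps = [line.split(":", 1)[1].strip()
                for line in robots.splitlines()
                if line.lower().startswith("sitemap:")]
    if not sitemaps:
        print("robots.txt declares no sitemap")
    for sm in sitemaps:
        try:
            with urllib.request.urlopen(sm) as resp:
                print(f"sitemap {sm}: HTTP {resp.status}")
        except OSError as err:
            print(f"sitemap {sm} unreachable: {err}")


if __name__ == "__main__":
    check_crawlability("https://www.usa.gov/")
```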