## Problem Statement
In January 2022, the Platform Content team introduced the "Drafts space" to the Platform website publishing workflow. Roughly three months have elapsed since its introduction, and we want to check whether the Drafts space bet is fulfilling its goals. (The goals are outlined in https://github.com/department-of-veterans-affairs/va.gov-team/issues/30104 )
How might we evaluate the Drafts space to confirm that authors are delivering more consistent, usable content (i.e., a better reading experience)?
## Hypothesis or Bet
If we track the measures outlined in the "Measurement" section of this ticket, then we can evaluate whether the Drafts space is achieving its goal of delivering a better reading experience.
## We will know we're done when... ("Definition of Done")
We have used pre- and post-introduction data to determine whether we should keep, discard, or modify the Drafts space.
## Known Blockers/Dependencies
* We will be relying on other teams to submit sentiment feedback.
* Limitations of Scroll Viewport / Confluence in capturing the requisite quantitative data (see the sketch below).
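To make the quantitative side concrete: if Scroll Viewport itself can't report edit activity, one possible workaround is to query the underlying Confluence Cloud REST API directly. The Python sketch below sums page version numbers in a space as a rough proxy for "volume of changes"; the space key and credentials are placeholders rather than the team's actual configuration, so treat this as an illustration, not a working integration.

```python
# Minimal sketch: approximate "volume of changes" for a Confluence space by
# summing the current version number of every page in it. Assumes a
# Confluence Cloud site and an API token; SPACE_KEY and AUTH are placeholders.
import requests

BASE_URL = "https://vfs.atlassian.net/wiki"   # Confluence Cloud site
SPACE_KEY = "AP"                              # placeholder space key
AUTH = ("user@example.com", "API_TOKEN")      # placeholder credentials

def total_page_versions(space_key: str) -> int:
    """Sum current version numbers across all pages in a space; each page's
    version number is the count of times it has been saved."""
    total, start, limit = 0, 0, 50
    while True:
        resp = requests.get(
            f"{BASE_URL}/rest/api/content",
            params={"spaceKey": space_key, "type": "page",
                    "expand": "version", "start": start, "limit": limit},
            auth=AUTH,
            timeout=30,
        )
        resp.raise_for_status()
        results = resp.json()["results"]
        total += sum(page["version"]["number"] for page in results)
        if len(results) < limit:
            return total
        start += limit

if __name__ == "__main__":
    print(f"Approximate edit count for {SPACE_KEY}: {total_page_versions(SPACE_KEY)}")
```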
## Projected Launch Date
End of Sprint 78 (May 24)
## Launch Checklist
### Guidance (delete before posting)
_This checklist is intended to help answer, "Is my Platform initiative ready for launch?" All of the items in this checklist should be completed, with artifacts linked (or a brief explanation of why they've been skipped), before launching a given Platform initiative. All links or explanations can be provided in the **Required Artifacts** sections. The items that can be skipped are marked as such._
_Keep in mind the distinction between **Product** and **Initiative**: each Product needs specific supporting documentation, but Initiatives to improve existing Products should reuse existing documentation for that Product. See [VSP Product Terminology](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-terminology.md) for details._
### Is this service / tool / feature...
### ... tested?
- [ ] Usability test (_TODO: link_) has been performed, to validate that new changes enable users to do what was intended and that these changes don't worsen quality elsewhere. If a usability test isn't relevant for this change, document the reason for skipping it.
- [ ] ... and issues discovered in usability testing have been addressed.
* _Note on skipping: metrics that show the impact of before/after can be a substitute for usability testing._
- [ ] End-to-end [manual QA](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/quality-assurance/README.md) or [UAT](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/research/planning/what-is-uat.md) is complete, to validate there are no high-severity issues before launching
- [ ] _(if applicable)_ New functionality has thorough, automated tests running in CI/CD
### ... documented?
- [ ] New documentation is written pursuant to our [documentation style guide](https://vfs.atlassian.net/wiki/spaces/AP/pages/622264362/Style+guide)
- [ ] Product is included in the [List of VSP Products](https://docs.google.com/spreadsheets/d/1Fn2lD419WE3sTZJtN2Ensrjqaz0jH3WvLaBtn812Wjo/edit#gid=0)
* _List the existing product that this initiative fits within, or add a new product to this list._
- [ ] Internal-facing: there's a [Product Outline](https://vfs.atlassian.net/wiki/spaces/PMCP/pages/1924628490/Product+Outline+Template)
- [ ] External-facing: a [User Guide on Platform Website](https://vfs.atlassian.net/wiki/spaces/AP/pages/1477017691/Platform+website+guidelines) exists for this product / feature / tool
- [ ] _(if applicable)_ Post to [#vsp-service-design](https://dsva.slack.com/channels/vsp-service-design) for external communication about this change (e.g. customer-facing meetings)
### ... measurable?
- [ ] _(if applicable)_ This change has clearly-defined success metrics, with instrumentation of those analytics where possible, or a reason documented for skipping it.
* For help, see: [Analytics team](https://depo-platform-documentation.scrollhelp.site/analytics-monitoring/Analytics-customer-support-guide.1586823275.html)
- [ ] This change has an accompanying [VSP Initiative Release Plan](https://github.com/department-of-veterans-affairs/va.gov-team/issues/new/choose).
### When you're ready to launch...
- [ ] Conduct a [go/no-go](https://vfs.atlassian.net/wiki/spaces/AP/pages/1670938648/Platform+Crew+Office+Hours#Go%2FNo-Go) when you're almost ready to launch.
## Required Artifacts
### Documentation
* **`PRODUCT_NAME`**: Platform website in the [Platform products and services](https://vfs.atlassian.net/l/c/TE5DHPbr) space
* **Product Outline**: https://vfs.atlassian.net/l/c/c3ZrRZqs
* **User Guide**: [Contributing to the Platform website guidelines](https://vfs.atlassian.net/l/c/ZaVJM7pQ)
### Testing
N/A: no testing is needed, as this initiative is about gathering information rather than creating or iterating on a product.
### Measurement
* **Success metrics**:
* Sentiment around publishing
* Volume of changes to the Platform website
* Number of errors picked up by the linter (see the sketch after this list)
* Sentiment around addressing errors picked up by the linter
* **Release plan**: https://github.com/department-of-veterans-affairs/va.gov-team/issues/39101
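The linter-error counts above could be tallied from whatever report the linter emits. As a purely hypothetical sketch (the JSON report shape and the `severity` field are assumptions for illustration, not the actual linter's output format), a small script like this could produce the pre/post comparison:

```python
# Hypothetical sketch for the linter-error metric: tally findings from a JSON
# report so pre- and post-Drafts-space counts can be compared. The report
# shape (a JSON array of findings, each with a "severity" field) is assumed.
import json
import sys
from collections import Counter

def count_findings(report_path: str) -> Counter:
    """Count linter findings by severity from a JSON report file."""
    with open(report_path, encoding="utf-8") as report:
        findings = json.load(report)
    return Counter(item.get("severity", "unknown") for item in findings)

if __name__ == "__main__":
    # Usage: python count_linter_errors.py pre_report.json post_report.json
    for path in sys.argv[1:]:
        counts = count_findings(path)
        print(f"{path}: {sum(counts.values())} findings ({dict(counts)})")
```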