As an ASAP developer, to determine how best to enable different personas with test data for using the automated validations, I would like to know whether reviewers want the ability to click a button to generate test data on the fly.
While it is clear this will benefit developers, it is not clear whether there will be immediate interest from the reviewers themselves. This is a spike to make that determination.
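To make the spike concrete, a button-driven generator could be as simple as the sketch below. This is a hypothetical illustration only, assuming the validations target OSCAL-like SSP documents; the element names, attribute names, and the `generate_test_ssp` function are illustrative, not the real OSCAL schema or any existing ASAP code.

```python
import random
import xml.etree.ElementTree as ET

def generate_test_ssp(n_controls=3, seed=None):
    """Generate a minimal, OSCAL-like SSP fragment for exercising validations.

    Hypothetical sketch: element and attribute names are illustrative
    placeholders, not the real OSCAL schema.
    """
    rng = random.Random(seed)
    root = ET.Element("system-security-plan")
    for i in range(n_controls):
        # Fabricate a control implementation with a randomized control id
        impl = ET.SubElement(
            root,
            "implemented-requirement",
            {"control-id": f"ac-{rng.randint(1, 25)}"},
        )
        ET.SubElement(impl, "description").text = f"Auto-generated statement {i + 1}"
    return ET.tostring(root, encoding="unicode")
```

A "generate test data" button in a reviewer-facing UI would call something like this and feed the result straight into the validation run, so reviewers could try the checks without hunting for a real SSP.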
Acceptance Criteria
[ ] Determine, through conversation or another method, the UX impact and whether this capability is desirable for one, some, or all of:
[ ] Reviewers
[ ] Rule writers
[ ] Tool developers
Story Tasks
[ ] Tasks...
Definition of Done
[ ] Acceptance criteria met - Each user story should meet the acceptance criteria in the description
[ ] Unit test coverage of our code > 90% (from QASP); this may be fuzzy and hard to prove
[ ] Code quality checks passed - Enable HTML Tidy with XML code standards as part of the build (from QASP)
[ ] Accessibility (from QASP) - As we create guidance, documentation, and reports, use semantic tagging (including ARIA tags); demonstrate 0 errors reported for WCAG 2.1 AA standards by an automated scanner and 0 errors reported in manual testing
[ ] Code reviewed - Code reviewed by at least one other team member (or developed by a pair)
[ ] Source code merged - Code that’s demoed must be in source control and merged
[ ] Code must successfully build and deploy into the staging environment (from QASP); this may evolve from the XSLT shell pipeline into something more
[ ] Security reviewed and reported - Conduct vulnerability and compliance scanning; consider threat modeling
[ ] Code submitted must be free of medium- and high-level static and dynamic security vulnerabilities (from QASP)
[ ] Usability tests passed - Each user story should be easy to use by target users (the development community? the FedRAMP FART team?)
[ ] Usability testing and other user research methods must be conducted at regular intervals throughout the development process, not just at the beginning or end (from QASP)
[ ] Code refactored for clarity - Code must be clean, self-documenting
[ ] No local design debt
[ ] Load/performance tests passed - Test data is needed; use Saxon instrumentation
[ ] Documentation generated - Update the README or CONTRIBUTING markdown as necessary
[ ] Architectural Decision Record completed as necessary for significant design choices
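The load/performance item above implies repeatable timing of validation runs. A minimal, tool-agnostic harness could look like the sketch below; it is an assumption of mine, not existing ASAP tooling, and the `time_runs` name is hypothetical. The callable passed in would wrap whatever actually runs the validation (for example, a Saxon invocation).

```python
import statistics
import time

def time_runs(fn, n=5):
    """Run fn() n times and return (mean_seconds, worst_seconds).

    Sketch of a load-test harness: fn is any zero-argument callable
    that performs one validation run.
    """
    durations = []
    for _ in range(n):
        start = time.perf_counter()
        fn()
        durations.append(time.perf_counter() - start)
    return statistics.mean(durations), max(durations)
```

For Saxon specifically, the callable could shell out to the transform; Saxon's command line also offers its own instrumentation (the `-t` timing flag and `-TP:file` profiling output), which may make a custom harness unnecessary.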
There seems to be plenty of more fruitful work in this space, but it might not be within the scope of the ASAP P4 work in particular. Labeling this as a nice-to-have stretch goal and keeping it in the larger backlog for now.