This ticket will be split into two: one with the documentation additions plus the template, the other with just the script.
I'm also debating whether to include those additions at all. The last 5+ scripts have generally followed the exact same pattern, so I'm thinking this could be helpful.
🥳 Successfully deployed to developer sandbox za.
Ticket
Resolves #2301
Changes
Context for reviewers
More broadly, this ticket adds a script that fills the `federal_type` field on `FederalAgency`. The `federal_type` value is obtained from our `DomainInformation` records.
To see the run result of this script on our data, visit getgov-litterbox
Broadly, what does the script do? This script does 3 things:
There are two tickets to run this script: https://github.com/cisagov/manage.get.gov/issues/2383 https://github.com/cisagov/manage.get.gov/issues/2382
See this slack thread for more information
Setup
Testing locally or on a sandbox: a) on a sandbox: `./manage.py transfer_federal_agency_type` b) locally: `docker-compose exec app ./manage.py transfer_federal_agency_type`
Do note that this script assumes that `DomainInformation` is generally accurate, so incorrect federal agency data there (e.g. fixtures data) will percolate into the result. There is one check in place that looks for a disparity between multiple records (which tells us the data is wrong), and that is generally effective at weeding bad data out, but not entirely for fixtures-only data.
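For reviewers unfamiliar with the pattern, the disparity check described above can be sketched roughly as follows. This is a simplified, hypothetical illustration (plain Python rather than the actual Django ORM code, and the function name `resolve_federal_types` is invented): agencies whose `DomainInformation` records all agree on one `federal_type` get that value transferred, while agencies with conflicting values are skipped as bad data.

```python
from collections import defaultdict

def resolve_federal_types(domain_infos):
    """Given (agency_name, federal_type) pairs pulled from
    DomainInformation-like records, return (resolved, skipped):
    resolved maps each agency to its agreed-upon federal_type,
    and skipped lists agencies with conflicting values (a disparity)."""
    seen = defaultdict(set)
    for agency, federal_type in domain_infos:
        if federal_type is not None:
            seen[agency].add(federal_type)

    resolved, skipped = {}, []
    for agency, types in seen.items():
        if len(types) == 1:
            # All records agree: safe to transfer this value.
            resolved[agency] = next(iter(types))
        else:
            # Records disagree: the data is wrong somewhere, skip it.
            skipped.append(agency)
    return resolved, skipped
```

Note that, as described above, this check only catches *conflicting* records; if every record for an agency carries the same wrong value (as can happen with fixtures-only data), it passes through unflagged.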
Verifying that the script works with prod data: a) visit the getgov-litterbox sandbox, OR b) run the import script on a sandbox of your choice as documented in our readmes, then run this script.
Code Review Verification Steps
As the original developer, I have
Satisfied acceptance criteria and met development standards
Ensured code standards are met (Original Developer)
Validated user-facing changes (if applicable)
As a code reviewer, I have
Reviewed, tested, and left feedback about the changes
Ensured code standards are met (Code reviewer)
Validated user-facing changes as a developer
[ ] New pages have been added to .pa11yci file so that they will be tested with our automated accessibility testing
[ ] Checked keyboard navigability
[ ] Meets all designs and user flows provided by design/product
[ ] Tested general usability, landmarks, page header structure, and links with a screen reader (such as Voiceover or ANDI)
[ ] Tested with multiple browsers, the suggestion is to use ones that the developer didn't (check off which ones were used)
[ ] (Rarely needed) Tested as both an analyst and applicant user
Note: Multiple code reviewers can share the checklists above; a second reviewer should not make a duplicate checklist.
As a design reviewer, I have
Verified that the changes match the design intention
Validated user-facing changes as a designer
[ ] Checked keyboard navigability
[ ] Tested general usability, landmarks, page header structure, and links with a screen reader (such as Voiceover or ANDI)
[ ] Tested with multiple browsers (check off which ones were used)
[ ] (Rarely needed) Tested as both an analyst and applicant user
Screenshots