cal-itp / reports

GTFS data quality reports for California transit providers
https://reports.calitp.org
GNU Affero General Public License v3.0

Release December reports.calitp.org #57

Closed holly-g closed 2 years ago

holly-g commented 2 years ago
Nkdiaz commented 2 years ago

Ongoing documentation for end-to-end generation of the December reports, with various checks.

Creating report data

pip install -r requirements.txt
make generate_parameters
make all -j 30

Generate static site

npm run build
python -m http.server

Review

Deploy Reports

To push report data to the production bucket:

make sync-prod

or, to copy the data from the dev bucket to the prod bucket:

gsutil -m rsync -r -d gs://gtfs-data-test/report_gtfs_schedule/ gs://gtfs-data/report_gtfs_schedule/
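Since the `-d` flag deletes objects in the prod bucket that are missing from dev, it can be worth previewing the sync first with gsutil's `-n` (dry run) flag. A minimal sketch of a wrapper around the command above (the `build_rsync_cmd` helper is hypothetical; the bucket paths come from the command shown):

```python
import subprocess

def build_rsync_cmd(src: str, dst: str, dry_run: bool = False) -> list[str]:
    """Build a gsutil rsync command. With -n, gsutil prints what it would
    copy or delete without modifying the destination bucket."""
    cmd = ["gsutil", "-m", "rsync", "-r", "-d"]
    if dry_run:
        cmd.append("-n")
    return cmd + [src, dst]

# Preview the dev -> prod copy before running it for real.
preview = build_rsync_cmd(
    "gs://gtfs-data-test/report_gtfs_schedule/",
    "gs://gtfs-data/report_gtfs_schedule/",
    dry_run=True,
)
# subprocess.run(preview, check=True)  # requires gsutil and bucket access
```

Dropping `dry_run=True` (or the `-n` flag) performs the actual sync.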

If there are no changes between development and production, rerun the last GitHub Actions workflow run on main.

Email reports

Testing

Obtain test emails from Evan. Verify with Olivia and calitp that the email contents are correct, then update the config file.

Production

Once the emails have passed visual inspection, change the config file for production:

python 3_generate_report_emails.py production 

Answer the prompt that asks whether production is correct, verify that the email recipients are the production list, and send out the emails.
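The production safety prompt described above might look something like this sketch (the function name, prompt wording, and recipient handling are all hypothetical; the real logic lives in `3_generate_report_emails.py`):

```python
def confirm_production(env: str, recipients: list[str], ask=input) -> bool:
    """Hypothetical guard: when targeting production, show the recipient
    list and require the operator to type 'yes' before sending."""
    if env != "production":
        return True
    print(f"About to email {len(recipients)} production recipients:")
    for addr in recipients:
        print(f"  {addr}")
    answer = ask("Send to PRODUCTION? Type 'yes' to continue: ")
    return answer.strip().lower() == "yes"

# A non-production run needs no confirmation.
assert confirm_production("test", ["someone@example.com"])
```

The point of the guard is that a production send always pauses for an explicit, typed confirmation after the recipient list is displayed.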

Verify that the emails were successfully sent.

themightychris commented 2 years ago

It's an odd flow that re-running the last github action on main after pushing new reports data to the prod bucket is the way to get prod updated with the new report data. A minor improvement might be to augment the action to also support a workflow_dispatch trigger with an env input
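For reference, the suggested trigger might look like this fragment (a sketch only; the existing workflow's other contents, and how the input is consumed, are assumptions):

```yaml
on:
  push:
    branches: [main]
  workflow_dispatch:
    inputs:
      env:
        description: "Deployment environment"
        required: true
        default: "prod"
        type: choice
        options: [dev, prod]
```

The chosen value would then be available to the workflow's steps as `${{ inputs.env }}`, so a deploy could be dispatched manually against either environment without rerunning a prior run.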

evansiroky commented 2 years ago

> It's an odd flow that re-running the last github action on main after pushing new reports data to the prod bucket is the way to get prod updated with the new report data. A minor improvement might be to augment the action to also support a workflow_dispatch trigger with an env input

I've wondered if we even need GitHub Actions for deployment. Maybe I'm living in the past, but developing locally and deploying from the local machine seems like it could be quicker than waiting ~1 hour to upload all the raw notebook data (okay, maybe not an hour if you're only updating one month) plus another ~10 minutes for a GitHub Action to complete, since it then needs to download all that raw data again. It's also a multi-step deployment process that is prone to human error and distraction. If it's possible to upload just the minified HTML/JS/CSS directly from a local machine, that could be much faster.

See https://github.com/cal-itp/reports/issues/66 as a place to continue the discussion.

Nkdiaz commented 2 years ago

December reports were sent out on 12-12-21