To validate the output of the JWST pipeline, files, figures, and other products will need to be inspected by eye. To streamline the effort we suggest a continuous integration approach with a Jupyter-notebook-based workflow. This effort will capture documentation, code, figures, and testing logs to make a 'one stop shop' for the validation testing effort. The notebook design will be flexible enough for JWST instrument teams to make necessary changes to achieve their testing goals. The goal is to build off of the STScI notebooks initiative and eventually integrate the validation notebooks into the greater notebooks repository that is available to the public.
If inspection by human eye is necessary, format the tests as notebooks and place them in the JWST Validation Notebooks repo. If a test uses pytest, it needs to live inside the JWST unit or regression testing suite.
Key thing: workflow. Design the integration steps from the JWST Validation Notebooks repo to the public notebooks repo so they aren't painful (e.g. changing CRDS file calls to astroquery calls). The JWST Validation Notebooks repository will be separate to begin with; make sure it doesn't diverge from the main notebooks repo, and make sure the underlying machinery is usable (HTML generation from notebooks, etc.).
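The HTML-generation machinery can be as simple as nbconvert's HTMLExporter; a minimal sketch, where the function name and file paths are illustrative rather than the repo's actual layout:

```python
# Minimal sketch: convert a notebook to a standalone HTML page with nbconvert.
import nbformat
from nbconvert import HTMLExporter

def notebook_to_html(notebook_path, output_path):
    """Read a .ipynb file and write it out as a standalone HTML page."""
    nb = nbformat.read(notebook_path, as_version=4)
    body, _resources = HTMLExporter().from_notebook_node(nb)
    with open(output_path, "w", encoding="utf-8") as f:
        f.write(body)
    return output_path
```

The real machinery would also execute the notebook first (e.g. with nbconvert's ExecutePreprocessor) so the rendered page contains fresh outputs.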
Currently the notebook repo is computationally slow because Travis acts as an outside user (a feature, not a bug). It will be worse for the pipeline, which needs data files, so the Travis solution won't work.
Because the validation testing will be internal, we can use CRDS, but it then won't be publicly available. First pass: use Artifactory and make it internally facing. Long term: use Box or some other hosting software and make it externally facing. Use the abstraction layer Steve mentioned, which will make it easy to switch between software without having to change source code.
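A minimal sketch of what that abstraction layer could look like, assuming a simple get_file interface; the class and method names here are invented for illustration, not an existing STScI API:

```python
# Hedged sketch: hide the hosting backend (Artifactory internally now, Box or
# similar externally later) behind one interface, so notebooks never call a
# backend directly and swapping backends never touches notebook code.
from abc import ABC, abstractmethod

class DataSource(ABC):
    @abstractmethod
    def get_file(self, name: str) -> str:
        """Return a location for the named test file."""

class ArtifactorySource(DataSource):
    def __init__(self, base_url):
        self.base_url = base_url

    def get_file(self, name):
        # Real code would download via the Artifactory REST API.
        return f"{self.base_url}/{name}"

class BoxSource(DataSource):
    def __init__(self, folder_id):
        self.folder_id = folder_id

    def get_file(self, name):
        # Real code would fetch the file through the Box SDK.
        return f"box://{self.folder_id}/{name}"

def get_test_file(name, source: DataSource):
    """What a validation notebook would call; backend-agnostic."""
    return source.get_file(name)
```

Switching from internal to external hosting then means constructing a different DataSource, with no changes to notebook source code.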
Open a ticket on how to retrieve data in the future.
For the validation notebook repo, use Jenkins inside the network to generate nice reports, and use CI Watson to fetch the data.
Notebooks are currently served via GitHub Pages; would we use Artifactory to do the same for the JWST Validation repo? Use the same backend to convert the notebooks to HTML, then decide where to dump them so a user can access them via a web browser without having to download an index.html. We may have to spin up a short Python script to serve the webpages (nothing too painful and easy to maintain; it should be ~10 lines of code).
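Such a script really can stay around ten lines using only the standard library; a sketch, with the served directory name as a placeholder and no claim this is how it would actually be deployed:

```python
# Tiny static file server for the generated HTML reports (internal,
# low-traffic use only). The directory to serve is a placeholder.
import http.server
import socketserver
from functools import partial

def make_server(directory=".", port=8000):
    """Build a TCP server that serves `directory` over HTTP."""
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    return socketserver.TCPServer(("", port), handler)

def serve(directory=".", port=8000):
    """Blocking call: serve `directory` until interrupted."""
    with make_server(directory, port) as httpd:
        httpd.serve_forever()
```

A user browsing to the server's root would get the index.html in the served directory rendered directly, with no download step.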
We should implement a feature for the test notebooks to run only the tests affected by a given change, plus a way to skip other tests because a notebook is too slow or isn't currently working.
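One way to sketch that selection logic: map each notebook to the pipeline steps it exercises, run only the notebooks touched by the changed steps, and keep an explicit skip list for notebooks that are too slow or known-broken. The mapping, step names, and notebook names below are invented for illustration:

```python
# Illustrative sketch of change-based notebook selection. In practice the
# mapping would be maintained alongside the notebooks (or derived from them).
NOTEBOOK_STEPS = {
    "test_dark_current.ipynb": {"dark_current"},
    "test_jump.ipynb": {"jump", "ramp_fit"},
    "test_flat_field.ipynb": {"flat_field"},
}

# Notebooks to skip regardless of changes (too slow, or currently broken).
SKIP = {"test_flat_field.ipynb"}

def notebooks_to_run(changed_steps):
    """Return the notebooks affected by `changed_steps`, minus skipped ones."""
    changed = set(changed_steps)
    return sorted(
        nb for nb, steps in NOTEBOOK_STEPS.items()
        if steps & changed and nb not in SKIP
    )
```

For example, a change touching only the jump step would select just the jump notebook, and a change touching flat_field would select nothing while that notebook sits on the skip list.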
Hopefully this captures the important stuff. Feel free to mention other things that I missed!
Issue JP-512 was created on JIRA by Mees Fix:
Notebooks Repository:
https://github.com/spacetelescope/notebooks
Notebook Style-Guide Repository:
https://github.com/spacetelescope/style-guides
JWST Validation Notebooks Repository:
https://github.com/spacetelescope/jwst_validation_notebooks