This workflow tests that examples and case studies in the gallery execute correctly, and saves any plots produced as artifacts for visual inspection.
It does not test that the documentation builds correctly with Sphinx.
Related issues
Closes #37.
Proposed changes
Run every Python file in examples/ and upload any *.png files as part of an examples ZIP.
Run every Python file in case-studies/ and upload any *.png files as part of a case-studies ZIP.
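For reference, the two steps above could be sketched roughly like this. This is a hypothetical illustration of the workflow, not the exact file in the PR; the workflow name, step names, Python version, and action versions are all assumptions, and the case-studies job would mirror the examples job with the directory and artifact name swapped.

```yaml
# Hypothetical sketch of the gallery-testing workflow (names and versions
# are illustrative, not taken from this PR).
name: Run gallery examples
on: [push, pull_request]

jobs:
  examples:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - name: Run every example script
        run: |
          for f in examples/*.py; do
            python "$f"
          done
      - name: Fail if no plots were produced
        run: test -n "$(find examples -name '*.png')"
      - name: Upload plots as a ZIP artifact
        uses: actions/upload-artifact@v4
        with:
          name: examples
          path: examples/**/*.png
```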
Nothing in this PR attempts to check that the PNGs are correct; I think the only way to do that is visual inspection. However, the validation will fail if no PNGs are produced at all.
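The "fail if no PNGs are produced" check could look something like the snippet below. This is a minimal sketch, not the actual step from the PR; the `check_pngs` helper name and its output messages are invented for illustration.

```shell
# Hypothetical helper: succeed only if at least one PNG exists under a
# directory, so a CI step can fail when the examples produce no plots.
check_pngs() {
  # Count PNG files anywhere under the given directory.
  count=$(find "$1" -name '*.png' | wc -l)
  if [ "$count" -eq 0 ]; then
    echo "No PNG files found in $1" >&2
    return 1
  fi
  echo "Found $count PNG file(s) in $1"
}
```

In a workflow step, a non-zero return from a command like this is enough to fail the job.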
It may be possible to improve on this further by uploading a separate artifact for each output graph, but I couldn't figure out how to do that with my limited experience of GitHub Actions.
I tested this on my own fork, and you can see an example output here.