Open · yarikoptic opened this issue 3 years ago
@yarikoptic More specific ideas:
We write scripts for generating the sample NWB files.
Each script also generates alongside each NWB file a provenance record/sidecar file listing the package versions used to create the file (See #6).
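A minimal sketch of the sidecar-writing step, assuming the provenance record is a JSON file placed next to the NWB file; the file-name convention, field names, and the `write_provenance_sidecar` helper are all hypothetical, not a defined format:

```python
import json
from importlib import metadata
from pathlib import Path

def write_provenance_sidecar(nwb_path, packages=("pynwb", "hdmf")):
    """Record the installed version of each package (None if not installed)
    in a JSON sidecar next to the NWB file, and return the sidecar path."""
    record = {}
    for pkg in packages:
        try:
            record[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            record[pkg] = None
    sidecar = Path(f"{nwb_path}.provenance.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar
```

A real record would presumably also cover the installed extension packages, which could be passed via `packages`.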
The scripts should probably also read the NWB files back in after writing to ensure that they contain what they should as a sanity test.
We compile a list of sets of pynwb, hdmf, and extension versions to use to generate the sample NWBs (See #5).
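One possible shape for that list, as plain Python data; the concrete versions and key names here are illustrative only, not taken from issue #5:

```python
# Each entry is one environment to generate samples in: pinned pynwb and hdmf
# versions, plus at most one (extension name, extension version) pair.
VERSION_SETS = [
    {"pynwb": "2.0.0", "hdmf": "3.1.1", "extension": None},  # core only
    {"pynwb": "2.0.0", "hdmf": "3.1.1", "extension": ("ndx-events", "0.2.0")},
]
```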
We write & run a script that creates a virtualenv for each version set, installs the given versions of the packages in the venv, and then runs the appropriate NWB-generating script(s) in the venv to produce NWBs generated by all of the package combinations.
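That driver script might look roughly like the following, using the stdlib `venv` module; the `run_in_fresh_venv` helper and its arguments are a sketch of the idea, not the actual script:

```python
import subprocess
import sys
import venv
from pathlib import Path

def run_in_fresh_venv(env_dir, requirements, script):
    """Create a virtualenv at env_dir, pip-install the pinned requirements
    (e.g. ["pynwb==2.0.0", "hdmf==3.1.1"]), and run the given
    NWB-generating script with the venv's interpreter."""
    venv.EnvBuilder(with_pip=bool(requirements)).create(env_dir)
    bin_dir = "Scripts" if sys.platform == "win32" else "bin"
    python = Path(env_dir) / bin_dir / "python"
    if requirements:
        subprocess.run([str(python), "-m", "pip", "install", *requirements], check=True)
    subprocess.run([str(python), str(script)], check=True)
```

The outer loop would call this once per version set, pointing each run's script at the appropriate output directory.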
Assuming that the sample-production environments are parametrized by nothing more than a pynwb version, an hdmf version, and at most one extension + extension version, we could arrange the layout of the sample repository like so:
```
pynwb-{version}/
    hdmf-{version}/
        core/
            # Sample NWBs using the given pynwb & hdmf versions and no extensions
        {extension1}-{version1}/
            # Sample NWBs using the given pynwb & hdmf versions and the given
            # version of the given extension
        {extension1}-{version2}/
        {extension2}-{version1}/
        # etc.
```
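A small helper mapping a version set to its directory under this layout could look like the following; the `sample_dir` name and the exact `{package}-{version}` naming mirror the sketch above and are not a fixed scheme:

```python
from pathlib import Path

def sample_dir(root, pynwb_version, hdmf_version, extension=None):
    """Return the output directory for one version set, e.g.
    root/pynwb-2.0.0/hdmf-3.1.1/core for a core-only set, or
    root/pynwb-2.0.0/hdmf-3.1.1/ndx-events-0.2.0 for
    extension=("ndx-events", "0.2.0")."""
    base = Path(root) / f"pynwb-{pynwb_version}" / f"hdmf-{hdmf_version}"
    if extension is None:
        return base / "core"
    name, version = extension
    return base / f"{name}-{version}"
```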
We write tests that iterate through the sample NWBs and test that they can be loaded successfully and hold the expected data.
Each extension-specific test module could use `pytest.importorskip()` (assuming it's applicable to pynwb extensions) to avoid being run if the relevant extension isn't installed, and then each extension-specific test could be parametrized by the extension-specific NWB files (either a hardcoded list or generated via filepath iteration & provenance inspection).
We compile a list of sets of pynwb, hdmf, and extension versions to test the sample NWBs against (See #5).
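A sketch of what one such extension-specific test module might look like; the extension name (`ndx-events`), the `samples` root, and the directory-name filter are assumptions following the layout sketched earlier:

```python
from pathlib import Path

import pytest

# Discover this extension's sample files by walking the samples tree
# (an empty list if the samples aren't present).
SAMPLE_FILES = sorted(
    p for p in Path("samples").rglob("*.nwb")
    if p.parent.name.startswith("ndx-events-")
)

@pytest.mark.parametrize("nwb_path", SAMPLE_FILES, ids=str)
def test_roundtrip(nwb_path):
    # Skip (rather than fail) when the extension or pynwb isn't installed.
    pytest.importorskip("ndx_events")
    pynwb = pytest.importorskip("pynwb")
    with pynwb.NWBHDF5IO(str(nwb_path), "r", load_namespaces=True) as io:
        nwbfile = io.read()
    assert nwbfile is not None  # plus assertions on the expected contents
```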
We use Jinja2 templates to generate GitHub Actions workflows from the version sets (one workflow per extension) that install the specified package combinations and then run the tests against the relevant sample NWBs.
We use Jinja2 templates to generate a README showing a grid of CI status badges for each workflow.
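The README-generation step could be as simple as the sketch below; the repository slug, the `test-{extension}.yml` workflow-file naming, and the one-row-per-extension table shape are all assumptions (a full grid would add a column per version set):

```python
from jinja2 import Template

README_TEMPLATE = Template("""\
# Sample NWB compatibility matrix

| extension | status |
| --- | --- |
{%- for ext in extensions %}
| {{ ext }} | ![{{ ext }}](https://github.com/{{ repo }}/actions/workflows/test-{{ ext }}.yml/badge.svg) |
{%- endfor %}
""")

def render_readme(repo, extensions):
    """Render a Markdown table of GitHub Actions badge images,
    one row per extension workflow."""
    return README_TEMPLATE.render(repo=repo, extensions=extensions)
```

The workflow files themselves could be rendered the same way, from a template parametrized by the version sets.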
@yarikoptic Possible alternative way of structuring the production & testing code that may have been more in line with what you were getting at in #2:
copy/pasted from the original hackathon project page