hadim opened this issue 7 years ago
Here is my plan for the imagej/tutorials repository:
All tutorial notebooks use a version of imagej/imagej (e.g., 2.0.0-rc-58 as of this writing), which ensures that the components of that version play nicely together as a whole.
There is also a plan for validating different versions of components against one another. It is called the "melting pot" and the script for it can be seen here. The idea is to clone a bunch of Git repositories and union them into a big multi-module Maven project, with the modules coupled against each other rather than against any external JAR dependencies. Then build the whole thing, and run the unit tests. This will help a lot to detect breakages due to dependency skew. Essentially, it will help ensure that the Bill of Materials promised in pom-scijava does indeed present compatible versions of everything.
I do not have a plan for testing the scripts in this repository. One option would be to run them from the CLI and check the outputs. But this assumes they are headless-friendly, and/or produce meaningful outputs which can be checked non-graphically.
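To illustrate the idea, something like the sketch below could run one script non-graphically and print its outputs so they can be checked against expected values. This is not an existing tool; the class name is made up, the script path is just an example, and a truly headless run would also need `-Djava.awt.headless=true`:

```java
import java.io.File;
import java.util.Map;

import net.imagej.ImageJ;
import org.scijava.script.ScriptModule;

// Hypothetical sketch: run one script non-graphically and dump its outputs.
public class RunScriptHeadless {

    public static void main(final String[] args) throws Exception {
        // For a truly headless run, launch the JVM with -Djava.awt.headless=true.
        final ImageJ ij = new ImageJ();
        final File script = new File(
            "src/main/resources/script_templates/Tutorials/Simple_Convolution.groovy");

        // Run the script; 'true' asks SciJava to pre- and post-process inputs/outputs.
        final ScriptModule module = ij.script().run(script, true).get();

        // Inspect the declared outputs non-graphically.
        final Map<String, Object> outputs = module.getOutputs();
        outputs.forEach((name, value) -> System.out.println(name + " -> " + value));

        ij.context().dispose();
    }
}
```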
Seems like a nice and ambitious plan!
About this repo, there are already tests for some scripts: https://github.com/imagej/imagej-scripting/blob/master/src/test/java/net/imagej/scripting/ImageJ2ScriptTest.java
Most of the time we only check that the script runs without exceptions, but I think it's already a good test.
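Roughly, such a test looks like this (a simplified sketch, not the actual code from the linked file; the class name and inline Groovy snippet are made up, and it assumes scripting-groovy is on the test classpath):

```java
import static org.junit.Assert.assertNotNull;

import java.util.concurrent.TimeUnit;

import net.imagej.ImageJ;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.scijava.script.ScriptModule;

public class ScriptSmokeTest {

    private ImageJ ij;

    @Before
    public void setUp() {
        ij = new ImageJ();
    }

    @After
    public void tearDown() {
        ij.context().dispose();
    }

    @Test
    public void scriptRunsWithoutException() throws Exception {
        // A tiny inline Groovy script standing in for a real template.
        final ScriptModule module =
            ij.script().run("add.groovy", "1 + 1", true).get(1, TimeUnit.MINUTES);
        // Passing criterion: execution completed without throwing.
        // Stronger tests would also inspect module.getOutputs().
        assertNotNull(module);
    }
}
```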
So it's just a matter of triggering the tests when you are releasing a new component such as imagej. Actually it could be done at the same time from the CLI when you run the notebooks.
What do you think?
> Most of the time we only check that the script runs without exceptions, but I think it's already a good test.
Best would be to test that the script outputs match expected values. Right now I see that we do check outputs, but only that they are not null. Is there a problem with asserting the actual expected values in those tests?
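For example, the non-null check could be strengthened with a small helper along these lines (just a sketch; the helper and its names are made up, and it assumes the script outputs are ImgLib2 images of UnsignedByteType):

```java
import static org.junit.Assert.assertEquals;

import net.imglib2.RandomAccess;
import net.imglib2.RandomAccessibleInterval;
import net.imglib2.type.numeric.integer.UnsignedByteType;

public final class PixelAssertions {

    private PixelAssertions() {}

    /** Asserts that the image holds the expected sample value at the given position. */
    public static void assertSample(
        final RandomAccessibleInterval<UnsignedByteType> img,
        final long[] position, final int expected)
    {
        final RandomAccess<UnsignedByteType> ra = img.randomAccess();
        ra.setPosition(position);
        assertEquals(expected, ra.get().get());
    }
}
```

A test could then call e.g. `assertSample(output, new long[] { 10, 10 }, 200)`, with the expected value taken from a known-good run of the script.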
> So it's just a matter of triggering the tests when you are releasing a new component such as imagej. Actually it could be done at the same time from the CLI when you run the notebooks.
Yes, if we put imagej-scripting into the melting pot I mentioned, it will check all the tests that way. Good! :+1:
I don't see any problem with checking for correct output.
@hadim I improved the tests to verify actual sample values now. Not all of them, but enough.
It is probably an issue, though, that all of the ImageJ2ScriptTest tests return images with all zeroes. Sorry, I don't have time to look into it.
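For whoever picks this up, a small check along these lines would at least flag the all-zero outputs automatically (a sketch; the class and method names are made up, and it assumes the outputs are ImgLib2 images of a real type):

```java
import net.imglib2.IterableInterval;
import net.imglib2.type.numeric.RealType;

public final class ImageChecks {

    private ImageChecks() {}

    /** Returns true if every sample of the image is exactly zero. */
    public static <T extends RealType<T>> boolean isAllZero(final IterableInterval<T> img) {
        for (final T sample : img) {
            if (sample.getRealDouble() != 0.0) {
                return false;
            }
        }
        return true;
    }
}
```

Each test could then do something like `assertFalse(ImageChecks.isAllZero(output))`, so the regression is visible immediately.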
I'll try to have a look when I have time.
Sure, no rush. I'd rather push on the scijava-jupyter-kernel, since I'll need it next month. 😁 These scripts will keep till later.
A good practice would be not to merge new PRs without tests.
Some tutorials are broken because of changes in the ops signatures.
See this one for example: https://github.com/imagej/imagej-scripting/blob/master/src/main/resources/script_templates/Tutorials/Simple_Convolution.groovy
We should have tests for every script (when possible), but not only that. We should also find a way to run the tests not only when a new version of imagej-scripting is released, but also when other critical IJ components are released, such as imagej-ops, imagej, imglib2, scijava, etc.

Ping @imagejan @ctrueden