I think this is a function of tests not being skipped, since Circle has more software bundled.
i need to time all the tests. i used to run things with all software installed, and the tests would still run in a reasonable time.
The bedpostx test and the tbss tests are run only on Circle (and are long), but they may not account for that large a difference.
just downloaded the nipype/nipype:py36 container and ran the tests locally:
============= 2893 passed, 54 skipped, 7 xfailed, 16 warnings in 783.33 seconds =============
will now run the pytest script exactly.
So Circle does seem to have somewhat underpowered hardware. I guess we could add an entry to the config to print out the CPU and memory info, just to verify. Are there any prospects for reducing runtime apart from disabling tests? Does pytest allow parallelism, or could we hack it to create a pytest command for each file independently and use `parallel` to chug through a little faster?
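Something like this is what I have in mind (just a sketch, not tested against our setup; pytest-xdist and GNU parallel would both be new additions to the images):

```bash
# hardware info we could print from the CI config, to verify what Circle gives us
nproc
free -h

# option 1: pytest-xdist spreads tests across worker processes
pip install pytest-xdist
py.test -n auto nipype

# option 2: one pytest invocation per test file, fanned out with GNU parallel
find nipype -name "test_*.py" | parallel -j 4 py.test {}
```

With option 2 we would also have to merge the coverage and junit reports from each invocation, which may be more trouble than it's worth.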
@effigies - i have one more test to do: the FSL_COURSE_DATA was not mounted into the container and resource_monitoring was off.
but if the tests associated with FSL_COURSE_DATA are indeed the reason, we should not really be doing full-scale imaging workflow runs in the pytest modules, but rather via the examples. the pytests should be really lightweight and mostly be about covering our code rather than external interactions.
will post soon
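roughly what i'm comparing, for reference (the mount point and test directory here are placeholders, not the exact paths used on circle):

```bash
# without the course data, the heavy workflow tests should skip themselves
unset FSL_COURSE_DATA
py.test nipype/workflows/dmri/fsl/tests/

# with it pointing at the mounted data, they run (and should dominate the total time)
export FSL_COURSE_DATA=/data/fsl_course_data   # hypothetical mount point
py.test nipype/workflows/dmri/fsl/tests/
```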
yup, the differences come from four tbss tests.
============ 2897 passed, 50 skipped, 7 xfailed, 16 warnings in 2153.62 seconds =============
i'll try to refactor these out into examples.
ok so the regression tests that @oesteban added for the dmri/fsl workflows are indeed what takes the extra time. i think we should split these tests out of the unittest modules. they could be added to smoke tests via examples, but more generally, at least for workflows, we should create additional regression tests.
this would be true of recon-all and others in addition to these. given that these tests take a while to run, we should also be intelligent about when they are run. we could try to create a system that figures out which interfaces/workflows are affected by a code change; that could trigger a few or many of these examples to run. this is also what this project is going to partially support: https://github.com/ReproNim/regtests
for the moment, we could skip these tests to reduce the time of at least one of the containers on Circle.
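something like this, roughly (the -k keywords are a guess at the test names; alternatively we just don't expose the course data for that container):

```bash
# deselect the long-running workflow regression tests by keyword
# (assumes the slow test names contain "tbss" / "bedpostx")
py.test -k "not tbss and not bedpostx" nipype

# or leave FSL_COURSE_DATA unset for that container, so the
# data-gated workflow tests skip themselves
env -u FSL_COURSE_DATA py.test nipype
```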
@oesteban and @effigies : what do you think?
That seems reasonable.
Looks good 👍
py.test takes 39 minutes on Circle and about 8 minutes on Travis.