pyiron / pyiron_continuum

pyiron_continuum - additional modules for the pyiron IDE supporting continuum scale simulations and workflows
BSD 3-Clause "New" or "Revised" License

Coverage too low #310

Open samwaseda opened 1 month ago

samwaseda commented 1 month ago

Currently the coverage stands somewhere around 50–60%. I guess there are the following reasons:

I personally would love to aim at something like 80%. Let's give it a try!!

liamhuber commented 1 month ago

Schrödinger (64%): @liamhuber @raynol-dsouza (if you don't have time, leave it like this; I find it important that there's also Schrödinger)

I don't have time, but I agree it's nice to keep this code in the repo. One shortcut to boost this coverage would be to look in the demo notebook for this tool and copy-paste all or parts of it into tests/integration/schroedinger.py. This doesn't give us any real benefit, since we already run the notebooks as part of our tests and would get alerted if they start failing, but it would let the coverage see that code and give us credit for it.

It's not as nice as real unit tests, but it's not abhorrent. It also doubles the computational cost, but for this particular package I think that's probably trivial.
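
As a rough sketch of that shortcut -- the notebook path and which cells to keep are just placeholders here -- something like this would dump a sub-selection of the demo notebook into the integration folder:

```python
import nbformat

# Hypothetical paths -- adjust to the actual demo notebook location.
nb = nbformat.read("notebooks/schroedinger.ipynb", as_version=4)
code_cells = [cell.source for cell in nb.cells if cell.cell_type == "code"]

# Keep only the cheap cells (e.g. the 1D example); the slice is a placeholder.
selected = code_cells[:5]

with open("tests/integration/schroedinger.py", "w") as f:
    f.write("\n\n".join(selected) + "\n")
```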

Automatic population of Creator? Maybe @liamhuber?

Do whatever you want to the creator. This was a QoL platform/framework/syntax thing, and while I use a similar concept to make stuff available in the pyiron_workflow framework, I don't care about the particular implementation here surviving. Keep it or destroy it, whichever you see fit and find most productive.

samwaseda commented 1 month ago

This doesn't give any real benefit to us since we run notebooks as part of our tests and thus would already get alerted if it starts failing

Does that mean that in the centralised one it runs regardless of whether integration is set? That might be a bit problematic, because the Damask notebooks take a really long time.

liamhuber commented 1 month ago

No, I don't mean to run the notebooks as part of coverage; I mean to include tests/integration in the coverage test run. For Damask that folder currently only has damask/test_backwards_compatibility.py, which I hope is not particularly pricey. I really mean to copy-paste code from the notebook into a new .py file in the integration tests and run that, that's all.
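
For concreteness, a combined coverage run over unit tests plus a notebook-derived integration script could look roughly like this (the tests/unit folder name and the exact runner are assumptions; the real setup lives in the centralised workflows):

```python
import runpy
import unittest
import coverage

# Measure pyiron_continuum coverage across unit and integration tests.
cov = coverage.Coverage(source=["pyiron_continuum"])
cov.start()

# Regular unit tests (folder name is an assumption).
unittest.TextTestRunner(verbosity=2).run(unittest.TestLoader().discover("tests/unit"))

# Plain script copied from the notebook, executed as an integration test.
runpy.run_path("tests/integration/schroedinger.py", run_name="__main__")

cov.stop()
cov.save()
cov.report()
```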

I was a little bit wrong about schroedinger being cheap though -- the 3D example (I think?) is taking forever in the notebooks test. So I'm still on board with this line of attack, but you'll definitely need to copy-paste only a sub-selection of the notebook content to keep the runtime down.

liamhuber commented 1 month ago

I was a little bit wrong about schroedinger being cheap though -- the 3D example (I think?) is taking forever in the notebooks test.

Actually, maybe something has simply gone wrong. Your PR only took 12 minutes for the entire notebooks test, and mine has been hanging on the schroedinger file for at least five minutes now, with a total runtime of 22 minutes...

Ok, never mind, there must have been a hiccup on GitHub's side -- running the tests again, the schroedinger notebook finished in just a few seconds and my total runtime is comparable to yours. So you could even directly convert the notebook to a .py file and slap it in the integration folder.
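
For that direct conversion, nbconvert can do it in a couple of lines (paths are assumptions); the `jupyter nbconvert --to script` CLI would work just as well:

```python
from nbconvert import PythonExporter

# Convert the whole demo notebook into a plain .py integration test.
# Notebook and target paths are assumptions.
source, _ = PythonExporter().from_filename("notebooks/schroedinger.ipynb")

with open("tests/integration/schroedinger.py", "w") as f:
    f.write(source)
```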

samwaseda commented 1 month ago

No, I don't mean to run the notebooks as part of coverage; I mean to include tests/integration in the coverage test run. For Damask that folder currently only has damask/test_backwards_compatibility.py, which I hope is not particularly pricey.

It's not too pricey, but for small PRs I would probably only want to run the unit tests, so I think it would be good to have a somewhat complete set of unit tests in the end.
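
One possible way to keep the expensive parts out of small-PR runs would be to gate the integration tests behind an explicit flag, e.g. an environment variable (the variable name and the test class below are hypothetical, not an existing convention):

```python
import os
import unittest

# RUN_INTEGRATION is a made-up variable name for illustration only.
INTEGRATION = os.environ.get("RUN_INTEGRATION", "") == "true"


@unittest.skipUnless(INTEGRATION, "integration tests only run in the full CI job")
class TestBackwardsCompatibility(unittest.TestCase):
    def test_old_project_still_loads(self):
        # Placeholder for the actual check in damask/test_backwards_compatibility.py.
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```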