pynbody / genetIC

The GM initial conditions generator
GNU General Public License v3.0

Tests are not running by default and fail when enabled #27

Closed apontzen closed 6 years ago

apontzen commented 6 years ago

@Martin-Rey I uncovered a pretty serious bug today in the tests! In commit 7baec09ce670555 you changed compare.py to ask whether a tipsy file actually exists. However the code is broken and always returns False (you can't use wildcards in os.path.exists). As a result, the particle data has not been compared since that commit.
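To illustrate the bug: `os.path.exists` treats its argument as a literal path, so a pattern like `*.tipsy` never matches unless a file is literally named `*.tipsy`; `glob.glob` is the usual fix. A minimal sketch (the filenames here are hypothetical, not the actual ones in compare.py):

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    # Create a tipsy-style output file (hypothetical name for illustration).
    open(os.path.join(d, "output_001.tipsy"), "w").close()
    pattern = os.path.join(d, "*.tipsy")

    # os.path.exists interprets the pattern literally, so this is False
    # even though a matching file exists.
    print(os.path.exists(pattern))  # False

    # glob expands the wildcard and actually finds the file.
    print(glob.glob(pattern))
```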

When I re-enabled this I get a failure on test 02g. Could you investigate and issue a PR to repair compare.py and fix the failing unit test?

Martin-Rey commented 6 years ago

Yes, I will take a look. Is it urgent or can it wait until next week?

apontzen commented 6 years ago

From my perspective it can wait, although the failing unit test might seem a bit alarming if you are doing runs...

Martin-Rey commented 6 years ago

Yes I will do this sooner rather than later

Martin-Rey commented 6 years ago

I see the problem with the wildcard and os.path.exists(), but I can't reproduce the error on test_02g. This test only calculates the variance on multiple levels; the value can be manually crosschecked between the printed value in output.txt and check_variance.txt. Are those values different?

The line with os.path.exists was added to avoid the test failing if no tipsy files are generated, for example if the test output is in grafic format. I will find a way to handle this properly.
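One way to get the intended behaviour (skip the particle comparison when the test produced no tipsy output) while still using a wildcard is to let `glob` do the expansion. A sketch under assumed names; the real compare.py API and file layout may differ:

```python
import glob
import os

def tipsy_outputs(output_dir):
    # glob expands the wildcard (os.path.exists would not), returning
    # an empty list when the test wrote no tipsy files.
    return sorted(glob.glob(os.path.join(output_dir, "*.tipsy")))

def should_compare_particles(output_dir):
    # Skip the particle-data comparison cleanly, e.g. for grafic-only tests,
    # instead of silently never comparing anything.
    return len(tipsy_outputs(output_dir)) > 0
```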

apontzen commented 6 years ago

I'll wait until you have a fix for this; maybe 2g only seemed to fail because of the way I forced it to run.

apontzen commented 6 years ago

I'm still seeing failure on test 2g. It also fails in this travis build on my commit 526a0e96 (branch gadget-multimass)

https://travis-ci.com/ucl-cosmoparticles/genetIC/builds/66721488

I'm unclear about why this is happening, as everything passes if I revert to e048f03. But I can't see that I should have changed anything you're doing in that test.

On a related note, I can't actually quite see what the test does. It seems to output the variance in a specified region, but that is never actually tested so even if the output went wrong I think the tests would still pass?

Any insight would be much appreciated!

Martin-Rey commented 6 years ago

I think I designed this test when manually calculating the variance. I forgot that the actual variance value was never checked, but the test should still prove that asking for overdensity/variance calculations does not modify the final grid. I agree this is not critical and could be taken out to speed up the test suite.

I don't have the time to look at the details now, but two quick suggestions:

I would first manually crosscheck the variance value between output.txt and check_variance.txt to see whether or not the problem is in the variance part of the code.
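That manual crosscheck could itself be automated so the variance value is actually asserted rather than only printed. A hedged sketch: the file format assumed below (a line like `variance = 1.234e-03`) is hypothetical, and the real output.txt/check_variance.txt layout may differ:

```python
import math
import re

def read_variance(path):
    # Assumes a line of the form "variance = <number>" somewhere in the file;
    # this format is a guess for illustration, not the actual one.
    with open(path) as f:
        for line in f:
            m = re.search(r"variance\s*=\s*([0-9.eE+-]+)", line)
            if m:
                return float(m.group(1))
    raise ValueError(f"no variance found in {path}")

def variances_match(a, b, rtol=1e-5):
    # Compare with a relative tolerance rather than exact equality,
    # since the two values come from separately printed floats.
    return math.isclose(a, b, rel_tol=rtol)
```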

apontzen commented 6 years ago

OK, this prompted me to look at #26 which looks good modulo my minor comment.

Do I understand correctly from your comments that #26 also includes #25 which we can therefore close without merging?

I will wait til I can merge this in before looking again at the failing test. I think you're correct that it's something about the mapper rather than the variance code.

Martin-Rey commented 6 years ago

Yes, #26 (refactoring-mask) includes #25 (grafic units). #25 can be closed without merging once #26 is merged.

I will take a look at the gadget PR to see if I can spot anything