ImperialCollegeLondon / pyrealm

Development of the pyrealm package, providing an integrated toolbox for modelling plant productivity, growth and demography using Python.
https://pyrealm.readthedocs.io/
MIT License

Add unit tests for splash #170

Closed: tztsai closed this 7 months ago

tztsai commented 7 months ago

Description

This pull request introduces a unit test suite for the SplashModel class in test_splash.py. The main changes include:

  1. New Test Cases: Added new test cases for the estimate_daily_water_balance, estimate_initial_soil_moisture, and calc_soil_moisture methods.

  2. Randomized Testing: Introduced randomized testing by randomly generating dates and wn_init. This ensures that the tests cover a wide range of possible scenarios and increases the robustness of the test suite.
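The randomized setup described above can be sketched roughly as follows (an illustrative sketch only: the variable names, date range, and soil-moisture bound are assumptions, not the actual test code in this PR):

```python
import numpy as np

# Seeded generator so the "random" tests are reproducible across runs.
rng = np.random.default_rng(42)

# Draw random calendar dates within an assumed 20-year window.
start = np.datetime64("2000-01-01")
dates = start + rng.integers(0, 365 * 20, size=5).astype("timedelta64[D]")

# Random initial soil moisture (wn_init), bounded by an assumed
# maximum soil-moisture capacity of 150 mm.
wn_init = rng.uniform(0.0, 150.0, size=5)
```

Seeding the generator keeps the randomized inputs varied across test cases but reproducible when a failure needs to be debugged.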

Fixes #162, #175

Type of change

Key checklist

Further checks

tztsai commented 7 months ago

Tests are failing and need fixing.

Some tests check whether SplashModel raises errors when some input variables are out of bounds, as suggested in the last meeting. Since some boundary checks have not been implemented, these tests are failing.

MarionBWeinzierl commented 7 months ago

> Tests are failing and need fixing.
>
> Some tests check whether SplashModel raises errors when some input variables are out of bounds, as suggested in the last meeting. Since some boundary checks have not been implemented, these tests are failing.

In this case, the test needs to check that a specific error is raised, and succeed if this error is raised. Have a look at, for example, https://github.com/ImperialCollegeLondon/pyrealm/blob/develop/tests/unit/pmodel/test_fast_slow_scaler.py how this works.
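The pattern described here can be sketched as below (an illustrative example only: the function, error message, and bound are hypothetical stand-ins, not pyrealm's actual API; see the linked test_fast_slow_scaler.py for the real usage):

```python
import numpy as np
import pytest


def check_soil_moisture(wn):
    """Toy stand-in for a SplashModel input check (not the real pyrealm code)."""
    if np.any(np.asarray(wn) < 0):
        raise ValueError("soil moisture cannot be negative")
    return wn


def test_negative_soil_moisture_raises():
    # The test succeeds only if the expected error is actually raised;
    # it fails if no error (or a different error) occurs.
    with pytest.raises(ValueError, match="cannot be negative"):
        check_soil_moisture(np.array([-1.0, 10.0]))
```

With `pytest.raises`, an out-of-bounds input that silently passes through becomes a test failure rather than a test error, which is exactly the behaviour wanted here.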

tztsai commented 7 months ago

> Tests are failing and need fixing.
>
> Some tests check whether SplashModel raises errors when some input variables are out of bounds, as suggested in the last meeting. Since some boundary checks have not been implemented, these tests are failing.
>
> In this case, the test needs to check that a specific error is raised, and succeed if this error is raised. Have a look at, for example, https://github.com/ImperialCollegeLondon/pyrealm/blob/develop/tests/unit/pmodel/test_fast_slow_scaler.py how this works.

Thanks for the link, but I have already implemented them... I meant that SplashModel has not implemented the boundary checks and does not raise the expected errors when some input variables are out of bounds, which is why some unit tests do not pass.

MarionBWeinzierl commented 7 months ago

> Tests are failing and need fixing.
>
> Some tests check whether SplashModel raises errors when some input variables are out of bounds, as suggested in the last meeting. Since some boundary checks have not been implemented, these tests are failing.
>
> In this case, the test needs to check that a specific error is raised, and succeed if this error is raised. Have a look at, for example, https://github.com/ImperialCollegeLondon/pyrealm/blob/develop/tests/unit/pmodel/test_fast_slow_scaler.py how this works.
>
> Thanks for the link, but I have already implemented them... I meant that SplashModel has not implemented the boundary checks and does not raise the expected errors when some input variables are out of bounds, which is why some unit tests do not pass.

OK, then you should raise a new issue for this, and ideally implement the checks that you need in SPLASH, so that the tests pass.

davidorme commented 7 months ago

I'm also happy for a PR to include test-led development of the code, so we could add the modifications to SplashModel here. Unless that is bad practice, @MarionBWeinzierl?

MarionBWeinzierl commented 7 months ago

> I'm also happy for a PR to include test-led development of the code, so we could add the modifications to SplashModel here. Unless that is bad practice, @MarionBWeinzierl?

Hm, one could argue that it would go against a very clear separation of concerns in issues (this PR is about adding unit tests). But I am not opposed to what you suggest, if you are happy with it. The question would still be: do you (@davidorme) want

  1. an additional issue which is closed by this same PR?
  2. this work to be included in #162 ?
  3. or this work to be done "on the fly" for this PR without an issue?
davidorme commented 7 months ago

Let's do it properly and raise a new issue.

tztsai commented 7 months ago

I've added boundary checks in the Splash module. If necessary I can split it into a separate PR.
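Such boundary checks might look roughly like the sketch below (illustrative only: the function name, input names, and limits are assumptions, not the actual code added in this PR's diff):

```python
import numpy as np


def validate_splash_inputs(lat, elv, sf):
    """Raise ValueError if inputs fall outside plausible physical ranges.

    Hypothetical example; real bounds and names differ in pyrealm.
    """
    if np.any(np.abs(np.asarray(lat)) > 90):
        raise ValueError("latitude must lie in [-90, 90] degrees")
    if np.any((np.asarray(sf) < 0) | (np.asarray(sf) > 1)):
        raise ValueError("sunshine fraction must lie in [0, 1]")
    if np.any(np.asarray(elv) < -500):
        raise ValueError("elevation below plausible minimum (-500 m)")
```

Raising early with a specific message makes the failing tests described above pass via `pytest.raises`, and gives users a clear diagnostic instead of silently propagating invalid values.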

codecov-commenter commented 7 months ago

Codecov Report

Attention: 1 line in your changes is missing coverage. Please review.

Comparison is base (3014744) 93.83% compared to head (3b8e506) 93.99%.

Files Patch % Lines
pyrealm/splash/splash.py 93.75% 1 Missing :warning:
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           develop     #170      +/-   ##
===========================================
+ Coverage    93.83%   93.99%   +0.16%
===========================================
  Files           28       28
  Lines         1395     1399       +4
===========================================
+ Hits          1309     1315       +6
+ Misses          86       84       -2
```


MarionBWeinzierl commented 7 months ago

> I've added boundary checks in the Splash module. If necessary I can split it into a separate PR.

We do not need a new PR. A new issue explaining the problem and the necessary changes, which could be referenced in this PR, would be good.