ImperialCollegeLondon / pyrealm

Development of the pyrealm package, providing an integrated toolbox for modelling plant productivity, growth and demography using Python.
https://pyrealm.readthedocs.io/
MIT License

Add unit tests #153

Closed surbhigoel77 closed 5 months ago

surbhigoel77 commented 9 months ago

Description

Fixes #23. Fixes #191.

In this PR, we are adding new unit tests for the submodules in pmodel that do not have dedicated unit tests. These are the submodules we are writing tests for:

Current work in progress for optimal_chi has the following test cases:

Current work in progress for jmax_limitation has the following test cases:

Current work in progress for pmodel_environment has the following test cases:
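
As a rough illustration of the shape these test cases take, here is a minimal sketch for pmodel_environment; the input values are placeholders rather than the actual fixtures used in this PR.

```python
import numpy as np

from pyrealm.pmodel import PModelEnvironment


def test_pmodel_environment_basic():
    """Sketch: derived photosynthetic variables are finite and positive."""
    env = PModelEnvironment(
        tc=np.array([20.0]),
        patm=np.array([101325.0]),
        co2=np.array([400.0]),
        vpd=np.array([1000.0]),
    )

    # ca, kmm, gammastar and ns_star are computed on construction
    for attr in ("ca", "kmm", "gammastar", "ns_star"):
        values = getattr(env, attr)
        assert np.all(np.isfinite(values))
        assert np.all(values > 0)
```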

Type of change

Key checklist

Further checks

surbhigoel77 commented 9 months ago

Current setup for pmodel submodules with their unit tests:

  1. competition -> test_c3c4competition
  2. functions -> test_functions
  3. isotopes -> test_calc_carbon_isotopes
  4. subdaily -> test_memory_effect, test_fast_slow_pmodel, FastSlowPModel_JAMES (UT not available)
  5. fast_slow_scaler -> test_fast_slow_scaler
  6. cal_optimal_chi -> UT not available
  7. jmax_limitation -> UT not available
  8. pmodel_environment -> UT not available

We can add tests for the submodules for which UTs are not available.

davidorme commented 9 months ago

@surbhigoel77 I'm not quite sure what to review here? I only see the one line.

codecov-commenter commented 8 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 95.23%. Comparing base (7bf55ac) to head (1ff0456). Report is 44 commits behind head on develop.

Additional details and impacted files

```diff
@@             Coverage Diff             @@
##           develop     #153      +/-   ##
===========================================
+ Coverage    95.17%   95.23%   +0.05%
===========================================
  Files           28       28
  Lines         1701     1701
===========================================
+ Hits          1619     1620       +1
+ Misses          82       81       -1
```


surbhigoel77 commented 6 months ago

@davidorme could you review the structure of test_optimal_chi and also highlight the bounds applicable to the inputs and outputs?

surbhigoel77 commented 6 months ago

> LGTM - very clean. There does seem to be duplication between the two files - is one going to be retired?

@davidorme thanks for pointing that out. The file named test_calc_optimal_chi is the outdated one and will be removed.

surbhigoel77 commented 6 months ago

@davidorme Could you share the soft-bound values for the inputs to OptimalChi? I will include a unit testcase for the module based on these bounds.

davidorme commented 6 months ago

> @davidorme Could you share the soft-bound values for the inputs to OptimalChi? I will include a unit testcase for the module based on these bounds.

Hi @surbhigoel77 - so the rpmodel regression tests generate a bunch of input values to PModelEnvironment that cover a wide range of plausible input conditions, given in the ranges here:

https://github.com/ImperialCollegeLondon/pyrealm/blob/1f60bdba59367590fc94d3dbd2c52f579527b513/pyrealm_build_data/rpmodel/generate_test_inputs.py#L38-L48

If I run those through PModelEnvironment then we get the following ranges for the "photosynthetic environment" variables (ca, kmm, gammastar, ns_star):

```python
import json
import numpy as np
from pyrealm.pmodel import PModelEnvironment

# Build an environment from the shared rpmodel regression test inputs
data = json.load(open('../rpmodel/test_inputs.json'))
env = PModelEnvironment(
    tc=np.array(data['tc_ar']),
    co2=np.array(data['co2_ar']),
    patm=np.array(data['patm_ar']),
    vpd=np.array(data['vpd_ar'])
)
```

And hence

```
In [10]: env.kmm.min(), env.kmm.max()
Out[10]: (0.7300629340136395, 588.8274005285591)

In [11]: env.ca.min(), env.ca.max()
Out[11]: (15.363050478271433, 52.860853517068094)

In [12]: env.gammastar.min(), env.gammastar.max()
Out[12]: (0.133689580651389, 14.45454199349665)

In [13]: env.ns_star.min(), env.ns_star.max()
Out[13]: (0.6162019533770273, 6.347157204305028)
```

So broadly, kmm is $(0,1000)$, ca is $(0, 100)$, gammastar is $(0, 30)$ and ns_star is $(0, 10)$. Does that help?

It might be worth using that actual test dataset - it would be a convenient standard if you need one, and that way, if we need to update the ranges, the change cascades across multiple tests.
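
For example, a bounds check along those lines might look like this sketch - the bounds are just the broad ranges quoted above and the inputs are placeholders rather than the shared test data:

```python
import numpy as np

from pyrealm.pmodel import PModelEnvironment

# Broad soft bounds quoted above for the derived variables
SOFT_BOUNDS = {
    "kmm": (0.0, 1000.0),
    "ca": (0.0, 100.0),
    "gammastar": (0.0, 30.0),
    "ns_star": (0.0, 10.0),
}


def test_environment_within_soft_bounds():
    """Sketch: derived variables fall inside the broad ranges quoted above."""
    env = PModelEnvironment(
        tc=np.linspace(0, 40, 50),
        patm=np.full(50, 101325.0),
        co2=np.full(50, 400.0),
        vpd=np.full(50, 1000.0),
    )

    for name, (lower, upper) in SOFT_BOUNDS.items():
        values = getattr(env, name)
        assert np.all((values > lower) & (values < upper))
```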

surbhigoel77 commented 6 months ago

Thanks for sharing this @davidorme. Yes, using the actual test set is a decent idea.

But what is the ideal response to crossing the soft bounds? Is there a particular warning message that should be raised?
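
If the expected behaviour is a warning, I imagine the check would look roughly like this sketch - the warning class and the deliberately out-of-range temperature are assumptions to be confirmed:

```python
import numpy as np
import pytest

from pyrealm.pmodel import PModelEnvironment


def test_out_of_bounds_inputs_warn():
    """Sketch: out-of-range inputs are assumed to raise a UserWarning."""
    with pytest.warns(UserWarning):
        PModelEnvironment(
            tc=np.array([80.0]),  # deliberately implausible temperature
            patm=np.array([101325.0]),
            co2=np.array([400.0]),
            vpd=np.array([1000.0]),
        )
```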

surbhigoel77 commented 5 months ago

@davidorme could you review this PR?

There are 3 files to review as mentioned in the PR description. A couple of special notes:

  1. test_OptimalChi has a test case for a NaN value check, but it is commented out because it stops the GitHub checks from passing: the test case currently gives an assertion error (see the sketch after this list).
  2. test_pmodelenvironment has test cases that check the photosynthesis variables for plausible out-of-bound values. I have used the available test dataset (tc, patm, co2) to verify this, and none of the variables (kmm, ns_star, gammastar, ca) have any out-of-bound values. (These test cases are linked to #191.)
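
For reference, the NaN check has roughly this shape - whether NaN inputs should propagate (as assumed here) or raise instead is exactly the open question:

```python
import numpy as np

from pyrealm.pmodel import PModelEnvironment


def test_nan_inputs():
    """Sketch: NaN inputs are assumed to propagate into the derived variables."""
    env = PModelEnvironment(
        tc=np.array([20.0, np.nan]),
        patm=np.array([101325.0, 101325.0]),
        co2=np.array([400.0, 400.0]),
        vpd=np.array([1000.0, 1000.0]),
    )

    # If the intended behaviour is to raise instead, wrap the call in pytest.raises
    assert np.isnan(env.gammastar[1])
```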

surbhigoel77 commented 5 months ago

@davidorme Could you approve the changes that you requested? I will only be able to merge once you approve them. I will open a new PR to move the imports inside the test cases.