Current behavior
The regression tests implemented in aeolis/tests/regression_tests/ check whether running the simulation for each of the following cases consistently produces the same netCDF file across other code changes in the repository:
1D/case1_small_waves
1D/case2_larger_waves
1D/case3_erosion_avalanching
2D/Barchan_dune
The test cases include:
a check that the netCDF file is created as part of the simulation
a check that the aeolis.log file is created as part of the simulation
a check that the array shapes, dimensions, and values in the produced netCDF file match those stored in a reference output for the same model configuration file
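As a rough sketch of the three checks above (the helper name and the way the arrays are loaded are assumptions, not the suite's actual API), the per-case verification might look like:

```python
from pathlib import Path
import numpy as np

def check_case_outputs(case_dir, result, reference):
    """Hypothetical sketch of the three regression checks.

    case_dir  -- directory the simulation wrote its output to
    result    -- dict mapping variable name -> array read from the new netCDF file
    reference -- dict mapping variable name -> array from the stored reference file
    """
    # 1. The netCDF output file must exist after the simulation run.
    assert (Path(case_dir) / "aeolis.nc").exists()
    # 2. The aeolis.log file must exist as well.
    assert (Path(case_dir) / "aeolis.log").exists()
    # 3. Shapes, dimensions, and values must match the reference output.
    for name, ref in reference.items():
        out = result[name]
        assert out.shape == ref.shape
        np.testing.assert_allclose(out, ref)
```

The file names `aeolis.nc` and the variable-dict layout are placeholders; the real tests read both files with a netCDF library and compare variable by variable.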
Currently, the pytest output doesn't show the pass/fail status of each test case for each scenario, which makes the test output difficult to interpret and to debug in case of a failure.
Desired behavior
Display the pass/fail status of each test case per scenario in the pytest output.
Fix
The desired behavior can be achieved by breaking the large test into individual test cases and using pytest parametrization.
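One possible shape for this fix is sketched below. The case names come from the list above; `run_case` is a hypothetical placeholder for the real simulation driver, not the suite's actual API:

```python
import pytest

# One entry per regression scenario; pytest generates a separate
# test (with its own pass/fail status) for each parameter.
CASES = [
    "1D/case1_small_waves",
    "1D/case2_larger_waves",
    "1D/case3_erosion_avalanching",
    "2D/Barchan_dune",
]

def run_case(case):
    # Placeholder stub: in the actual suite this would run AeoLiS
    # for the given model configuration and return its output.
    return {}

@pytest.mark.parametrize("case", CASES)
def test_regression_case(case):
    output = run_case(case)
    # The real test would compare `output` against the stored
    # reference netCDF file for this case.
    assert output is not None
```

With this structure, pytest reports each scenario as its own test item instead of one monolithic pass/fail result.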
Running pytest -v on the command line will then display the status of each parametrized test case individually.