fabian-s closed this issue 2 years ago.
Thanks @fabian-s. We now have:
```
> covr::package_coverage(function_exclusions = '.onLoad')
compareMCMCs Coverage: 89.12%
R/metrics-addMetrics.R: 78.79%
R/MCMCresult.R: 80.77%
R/conversions.R: 81.25%
R/MCMCdef_stan.R: 81.97%
R/MCMCdef_dummy.R: 90.00%
R/renameMCMC.R: 90.48%
R/make_MCMC_comparison_pages.R: 91.19%
R/compareMCMCs.R: 92.25%
R/runNIMBLE.R: 92.75%
R/MCMCdef_jags.R: 93.75%
R/metrics.R: 98.21%
R/MCMCdefs_env.R: 100.00%
R/utils.R: 100.00%
```
We went through the details from `covr::tally_coverage()` and added tests for non-trivial lines that lacked coverage. Most of the remaining gaps are simple lines executed when errors are trapped (e.g. calls to `stop()`). We marked those with the comment `#lacks test coverage` to simplify any future coverage efforts. From our perspective, test coverage is now in pretty good shape. Does this look ok for JOSS?
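For anyone revisiting coverage later, here is a minimal sketch of that workflow. It assumes `tally_coverage()` returns a data frame with `filename`, `line`, and `value` (hit count) columns, and `check_samples()` is an illustrative stand-in rather than an actual compareMCMCs function:

```r
library(covr)
library(testthat)

## Coverage with .onLoad excluded, as in the report above
cov <- package_coverage(function_exclusions = ".onLoad")

## Per-line tallies; rows with value == 0 are lines no test executes
## (assumes the returned data frame has filename, line, and value columns)
tallied <- tally_coverage(cov, by = "line")
uncovered <- tallied[tallied$value == 0, c("filename", "line")]
print(uncovered)

## Error-trapping lines (calls to stop()) can usually be covered by
## asserting that the error is raised. check_samples() is an
## illustrative stand-in, not a compareMCMCs function.
check_samples <- function(samples) {
  if (is.null(samples)) stop("samples must not be NULL")
  invisible(samples)
}

test_that("invalid input triggers an informative error", {
  expect_error(check_samples(NULL), "must not be NULL")
})
```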
great, thanks!
This could/should be more comprehensive, but I realize it may be hard to write sensible unit tests for the results of stochastic algorithms...
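For what it's worth, one pragmatic pattern for stochastic output is to fix the RNG seed and check summary statistics against a loose tolerance rather than exact draws. In the sketch below, `rnorm()` is only a stand-in for an MCMC sample and the target values are illustrative:

```r
library(testthat)

## Fix the seed and compare summary statistics with a loose tolerance,
## rather than testing exact draws. rnorm() is only a stand-in for an
## MCMC sample; the target values are illustrative.
test_that("sampler roughly recovers the known mean", {
  set.seed(123)
  draws <- rnorm(1e4, mean = 2, sd = 1)
  expect_equal(mean(draws), 2, tolerance = 0.05)
  expect_gt(sd(draws), 0)  # the sampler actually moved
})
```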