njtierney opened this issue 1 year ago
Thanks for trying out the package!
Just to verify: are testthat/covr passing without problems, independent of covrpage?
Ah, they are not passing tests! Is that a requirement, or is there an option I can pass along to allow failing tests?
The package tries to get around failing tests by commenting out the offending lines in the test directory.
Judging from previous issues, something is usually very broken in the package tests if it can't get around the failures.
I can fork your repo and try to see why it is struggling.
OK! It might be a bit of admin/work installing the dependencies for greta, so no worries if it ends up being a bit much!
I just finished refactoring this package and used greta as one of my hard cases to see whether the refactor was successful. I checked out your PR and ran it, and it handled it great.
I don't have write privileges to greta so I can't commit to your PR. Here is the md output:
23 December, 2022 14:18:41
This output is created by covrpage.
Coverage summary is created using the covr package.
## ⚠️ Not All Tests Passed
## Coverage statistics are approximations of the non-failing tests.
## Use with caution
##
## For further investigation check in testthat summary tables.
Object | Coverage (%) |
---|---|
greta | 1.23 |
R/as_data.R | 0.00 |
R/calculate.R | 0.00 |
R/callbacks.R | 0.00 |
R/chol2symm.R | 0.00 |
R/conda_greta_env.R | 0.00 |
R/dag_class.R | 0.00 |
R/distribution.R | 0.00 |
R/extract_replace_combine.R | 0.00 |
R/functions.R | 0.00 |
R/greta_array_class.R | 0.00 |
R/greta_create_conda_env.R | 0.00 |
R/greta_install_miniconda.R | 0.00 |
R/greta_install_python_deps.R | 0.00 |
R/greta_mcmc_list.R | 0.00 |
R/greta_model_class.R | 0.00 |
R/greta_stash.R | 0.00 |
R/inference_class.R | 0.00 |
R/inference.R | 0.00 |
R/install_greta_deps.R | 0.00 |
R/joint.R | 0.00 |
R/mixture.R | 0.00 |
R/new_install_process.R | 0.00 |
R/node_class.R | 0.00 |
R/node_types.R | 0.00 |
R/operators.R | 0.00 |
R/optimisers.R | 0.00 |
R/probability_distributions.R | 0.00 |
R/progress_bar.R | 0.00 |
R/reinstallers.R | 0.00 |
R/samplers.R | 0.00 |
R/simulate.R | 0.00 |
R/structures.R | 0.00 |
R/testthat-helpers.R | 0.00 |
R/tf_functions.R | 0.00 |
R/transforms.R | 0.00 |
R/unknowns_class.R | 0.00 |
R/variable.R | 0.00 |
R/zzz.R | 0.00 |
R/utils.R | 4.60 |
R/checkers.R | 10.73 |
R/test_if_forked_cluster.R | 22.22 |
Unit Test summary is created using the testthat package.
file | n | time | error | failed | skipped | warning | icon |
---|---|---|---|---|---|---|---|
test_as_data.R | 2 | 0.299 | 0 | 0 | 2 | 0 | 🔶 |
test_calculate.R | 21 | 2.374 | 0 | 0 | 21 | 0 | 🔶 |
test_check_tf_installed.R | 3 | 0.426 | 0 | 0 | 3 | 0 | 🔶 |
test_distributions.R | 44 | 4.727 | 0 | 0 | 44 | 0 | 🔶 |
test_extract_replace_combine.R | 20 | 2.328 | 0 | 0 | 20 | 0 | 🔶 |
test_functions.R | 22 | 2.484 | 0 | 0 | 22 | 0 | 🔶 |
test_future.R | 4 | 3.771 | 0 | 2 | 0 | 0 | 🛑 |
test_greta_array_class.R | 2 | 0.415 | 0 | 0 | 2 | 0 | 🔶 |
test_greta_mcmc_list_class.R | 3 | 0.367 | 0 | 0 | 3 | 0 | 🔶 |
test_head_tail_work.R | 1 | 0.103 | 0 | 0 | 1 | 0 | 🔶 |
test_iid_samples.R | 6 | 0.765 | 0 | 0 | 6 | 0 | 🔶 |
test_inference.R | 24 | 2.995 | 0 | 0 | 24 | 0 | 🔶 |
test_install_greta_deps.R | 1 | 0.218 | 0 | 0 | 1 | 0 | 🔶 |
test_joint.R | 13 | 1.416 | 0 | 0 | 13 | 0 | 🔶 |
test_misc.R | 16 | 1.440 | 0 | 0 | 12 | 0 | 🔶 |
test_mixture.R | 12 | 1.418 | 0 | 0 | 12 | 0 | 🔶 |
test_operators.R | 6 | 0.798 | 0 | 0 | 6 | 0 | 🔶 |
test_opt.R | 4 | 0.421 | 0 | 0 | 4 | 0 | 🔶 |
test_posteriors.R | 7 | 0.894 | 0 | 0 | 7 | 0 | 🔶 |
test_representations.R | 8 | 0.855 | 0 | 0 | 8 | 0 | 🔶 |
test_simulate.R | 5 | 0.536 | 0 | 0 | 5 | 0 | 🔶 |
test_syntax.R | 4 | 0.572 | 0 | 0 | 4 | 0 | 🔶 |
test_transforms.R | 2 | 0.213 | 0 | 0 | 2 | 0 | 🔶 |
test_truncated.R | 15 | 1.750 | 0 | 0 | 15 | 0 | 🔶 |
test_variables.R | 8 | 1.022 | 0 | 0 | 8 | 0 | 🔶 |
test-diagrammer-installed.R | 1 | 0.115 | 0 | 0 | 1 | 0 | 🔶 |
test-message_if_using_gpu.R | 5 | 0.687 | 4 | 0 | 4 | 0 | 🔶 |
test-tensorflow-rpkg-stability.R | 6 | 0.761 | 0 | 0 | 6 | 0 | 🔶 |
Hi @yonicd !
Thank you very much for your help here, I really appreciate it! This has been really helpful in other packages I develop.
Alas, I still get this error:
Warning messages:
1: closing unused connection 7 (<-localhost:11642)
2: closing unused connection 6 (<-localhost:11642)
Looking at your tests above, it looks like a lot of the tests were skipped because the TF dependency wasn't installed - so perhaps I'm running into this issue because of some TF problem?
No worries if this is looking too complicated to solve, it's a super useful package that I'm really enjoying using in my other software :) so please feel free to close this issue
Just to check something on your end: is it Windows or Linux? If it's Windows, you need to be sure the package isn't loaded prior to running covrpage (there is an issue in covr, which Jim explained: Windows is funky in this regard). I tried to force an unload of the package covrpage is running against in the last commit.
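For concreteness, a minimal sketch of what "forcing an unload" before a covrpage run can look like. This is only an illustration, not covrpage's actual implementation, and the package name is just an example:

```r
# Unload the package under test before running covrpage/covr, since on
# Windows a loaded package can hold file locks that interfere with covr's
# re-installation and instrumentation of the package.
pkg <- "greta"  # example package name

if (pkg %in% loadedNamespaces()) {
  unloadNamespace(pkg)
}
```

After this, `pkg` is no longer in `loadedNamespaces()`, so covr can install and trace a fresh copy.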
@njtierney greta looks amazing (I installed tensorflow); you may have won over a new user.
I think I found the underlying issues.
On the side of covrpage:
testthat::set_max_fails(Inf)
This lets testthat continue past its default limit of maximum failures, so all the failing expectations are detected.
On the side of greta: the expectations are failing, but so is the call itself. covrpage will try to comment out expectations so they are skipped by covr, but it leaves the non-expectation lines alone.
E.g.:
test_that("parallel reporting works", {
skip_if_not(check_tf_version())
m <- model(normal(0, 1))
op <- future::plan()
# put the future plan back as we found it
withr::defer(future::plan(op))
future::plan(future::multisession)
# should report each sampler's progress with a fraction
out <- get_output(. <- mcmc(m, warmup = 50, n_samples = 50, chains = 2))
# expect_match(out, "2 samplers in parallel")
# expect_match(out, "50/50")
})
where the error is (line numbers mismatch with master because I'm running off of the tf2-poke-tf-fun branch):
Error (test_inference.R:441): parallel reporting works
<Rcpp::exception/C++Error/error/condition>
Error in `eval(expr, p)`: ValueError: Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor.
Backtrace:
1. greta:::get_output(...)
at test_inference.R:441:2
5. greta::mcmc(m, warmup = 50, n_samples = 50, chains = 2)
11. greta:::run_samplers(...)
at greta/R/inference.R:284:4
12. base::lapply(samplers, future::value)
at greta/R/inference.R:447:4
14. future:::value.Future(X[[i]], ...)
15. future:::signalConditions(...)
As you know, covr is much more sensitive than plain testthat calls. For some reason testthat continues on after that line fails, whereas covr borks on it.
If I comment out that line, then everything runs fine and covrpage completes its run, skipping over the offending expectations it finds.
You can also restructure the test like so, which gets the same desired effect:
test_that("parallel reporting works", {
skip_if_not(check_tf_version())
m <- model(normal(0, 1))
op <- future::plan()
# put the future plan back as we found it
withr::defer(future::plan(op))
future::plan(future::multisession)
# should report each sampler's progress with a fraction
#out <- get_output(. <- mcmc(m, warmup = 50, n_samples = 50, chains = 2))
expect_match(get_output(. <- mcmc(m, warmup = 50, n_samples = 50, chains = 2)), "2 samplers in parallel")
expect_match(get_output(. <- mcmc(m, warmup = 50, n_samples = 50, chains = 2)), "50/50")
})
Hi @yonicd !
Thanks for this package, this does exactly what I was looking for.
I'm having trouble getting this to run on the greta dev branch - here.
I get the following error as it is knitting the document when running:
covrpage::covrpage()
I'm fairly sure this is to do with the fact that greta is big and complex, but just wanted to flag it here in case it is