Closed drbenvincent closed 7 months ago
Pfft. These doctests are brittle. Despite thinking they were a great idea, at the moment they are causing more hassle than they are catching actual problems. Not sure what you think, @jpreszler?
There are definitely some that are particularly problematic, and will continue to be so without some significant changes. I think the value comes over the long-term, but it will be hard to get there without some improvements. The options I see:
My vote would be to temporarily turn off the doctests, replace the examples for at least PrePostFit (flaky), RegressionDiscontinuity (flaky), and Instrumental Variables (slow), and then turn doctests back on. I should be able to work on this soon, perhaps even getting better examples over the next week.
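One standard way to tame flaky doctests without deleting them is to use doctest directives. The sketch below is illustrative only: `fit_summary` and its output are made up for the example, not CausalPy's actual API. `+ELLIPSIS` masks the non-deterministic digits, and `+SKIP` drops an example that is too slow or too flaky to run in CI:

```python
import doctest
import random


def fit_summary():
    """Return a fitted-model summary string (hypothetical example).

    Non-deterministic values can be masked with ELLIPSIS, so the
    doctest no longer depends on the exact sampled numbers:

    >>> fit_summary()  # doctest: +ELLIPSIS
    'r2 = 0...'

    Or an example can be skipped entirely when it is too slow or flaky:

    >>> fit_summary()  # doctest: +SKIP
    'r2 = 0.95'
    """
    # Simulate a stochastic model fit: the digits vary run to run.
    return f"r2 = {random.uniform(0.90, 0.99):.2f}"


if __name__ == "__main__":
    # The SKIP example is ignored; the ELLIPSIS example tolerates
    # whatever random digits were produced.
    doctest.testmod()
```

This keeps the examples visible in the docs while decoupling the test from the sampler's exact output, which is usually preferable to commenting the examples out.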
Created https://github.com/pymc-labs/CausalPy/issues/284. For now, I think you can just comment out the doctest call in the CI until that issue is resolved.
Note to self: I'll finish and merge this PR after #286 is merged
Attention: 7 lines in your changes are missing coverage. Please review.
Comparison is base (6283c76) 76.50% compared to head (7e41931) 76.50%.
:exclamation: Current head 7e41931 differs from pull request most recent head 22f93ff. Consider uploading reports for the commit 22f93ff to get more accurate results.
| Files | Patch % | Lines |
|---|---|---|
| causalpy/pymc_experiments.py | 41.66% | 7 Missing :warning: |
:umbrella: View full report in Codecov by Sentry.
Well, I guess I have to increase test coverage to make codecov pass.