Closed arrjon closed 5 months ago
Attention: Patch coverage is 21.42857%, with 11 lines in your changes missing coverage. Please review.
Project coverage is 53.47%. Comparing base (b1298ea) to head (75bc446).
| Files | Patch % | Lines |
|---|---|---|
| pypesto/sample/dynesty.py | 8.33% | 11 Missing :warning: |
Do we want to add this flag only in the dynesty sampler? I think it would be better to add it at the objective level.
My question is whether the constants are necessary. Why not define the log-likelihood of dynesty to always remove the prior whenever a prior exists? The only case in which it should not remove the prior from the objective is when no prior exists, right? I might be missing something.
If you load a PEtab problem, that should work fine. But if you define your objective yourself (which could be the nllh) and also define `x_priors`, you would compute something wrong without getting any warnings. That is not only a problem in dynesty, but also with the other samplers, see #1325.
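To make the pitfall concrete, here is a minimal, self-contained sketch (toy functions only, not pyPESTO's API): if the user-supplied objective is already the negative log-likelihood but `x_priors` is also defined, a sampler that assumes the objective is the negative log-posterior and "removes" the prior will silently compute the wrong quantity.

```python
# Hypothetical illustration of the double-counting pitfall described above.
# All functions are toy examples, not pyPESTO code.

def neg_log_likelihood(x):
    # toy Gaussian likelihood centered at 1.0
    return 0.5 * (x - 1.0) ** 2

def neg_log_prior(x):
    # toy Gaussian prior centered at 0.0
    return 0.5 * x ** 2

# If the sampler assumes the objective is the negative log-posterior,
# it subtracts the prior to recover the likelihood:
wrong_neg_log_like = neg_log_likelihood(0.5) - neg_log_prior(0.5)

# But the user actually supplied the likelihood itself:
correct_neg_log_like = neg_log_likelihood(0.5)

# The two values differ, and without a check no warning is raised.
print(wrong_neg_log_like, correct_neg_log_like)
```

The point is that the sampler cannot tell from the objective alone whether the prior is already included, which is why an explicit flag (or a check at the objective level) is needed.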
The wrapper for the dynesty sampler differs significantly from the other samplers in pyPESTO. Usually the samplers expect the objective to be the posterior and `x_priors` to be defined, from which they compute the likelihood if needed. I added the argument `objective_type` to the sampler to switch between posterior and likelihood, with the default set to posterior. I also added checks in the `initialize` function to verify whether `x_priors` is defined in the problem. Furthermore, I added a warning that, by default, a uniform prior with the bounds specified in the problem is assumed. This warning was missing, but was already mentioned in the `prior_transform` function.
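The checks described above could look roughly like the following sketch. All names (`ToyProblem`, `initialize`, the `objective_type` values) are illustrative assumptions for this comment, not pyPESTO's actual implementation:

```python
import warnings

# Hypothetical sketch of the initialize() checks described above:
# verify whether x_priors is defined, and warn that a uniform prior
# over the problem bounds is otherwise assumed by the prior transform.

class ToyProblem:
    def __init__(self, lb, ub, x_priors=None):
        self.lb, self.ub, self.x_priors = lb, ub, x_priors

def initialize(problem, objective_type="neglogpost"):
    if objective_type == "negloglike" and problem.x_priors is not None:
        # objective claims to be a pure likelihood, yet priors exist:
        # silently removing/keeping them would be wrong either way
        raise ValueError(
            "objective_type='negloglike' but the problem defines x_priors; "
            "the objective would not be a pure likelihood."
        )
    if problem.x_priors is None:
        warnings.warn(
            "No x_priors defined; a uniform prior with the bounds "
            "specified in the problem is assumed by the prior transform."
        )

initialize(ToyProblem(lb=[0.0], ub=[1.0]))  # emits the uniform-prior warning
```

This mirrors the behavior described: an explicit switch with posterior as the default, plus a loud warning instead of a silent assumption.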