Closed theogf closed 2 years ago
Merging #26 (ff5f188) into main (ca39ed3) will not change coverage. The diff coverage is 100.00%.
```diff
@@           Coverage Diff           @@
##             main      #26   +/-  ##
=======================================
  Coverage   94.91%   94.91%
=======================================
  Files           3        3
  Lines          59       59
=======================================
  Hits           56       56
  Misses          3        3
```

| Impacted Files | Coverage Δ |
|---|---|
| src/abstractmcmc.jl | 100.00% <100.00%> (ø) |
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data.

Powered by Codecov. Last update ca39ed3...ff5f188.
Is it reasonable to forward all keyword arguments? And do you have any application or downstream code that exploits it? Then it would be easier to see if we should use a different design or if it works sufficiently well. Would be good to add such examples to the tests as well.
I literally want to do:

```julia
function initial_sample(rng, m::MyModel; init_params, kwargs...)
    return init_params
end
```
But I wanted to take an approach that would not affect the current package, hence the general `kwargs...` approach.
And sure, I can add some tests.
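A test along those lines could look something like this (a minimal sketch, not code from the package — `MyModel`, the sampled dimension, and the fallback draw are all illustrative):

```julia
using Random, Test

# Toy model used only for this sketch.
struct MyModel end

# Forwarding all keyword arguments lets `initial_sample` pick up a
# user-supplied starting point while falling back to a default draw.
function initial_sample(rng::Random.AbstractRNG, ::MyModel; init_params=nothing, kwargs...)
    return init_params === nothing ? randn(rng, 2) : init_params
end

rng = Random.MersenneTwister(42)
@test initial_sample(rng, MyModel(); init_params=[1.0, 2.0]) == [1.0, 2.0]
@test length(initial_sample(rng, MyModel())) == 2
```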
BTW, regarding initial parameters: the plan is to move the `initial_params` support in DynamicPPL/Turing to AbstractMCMC. The plan has existed for quite some time now but was not prioritized...
So ideally we would just support the AbstractMCMC interface for initial parameters, but of course this would require the upstream changes first.
Is there any open PR yet? [EDIT:] Checking AbstractMCMC, it does not look like it.
Yeah, unfortunately not. I'll add it to my (growing) list of things that should be prioritized... Until then, I suggest following the convention in other Turing packages such as AdvancedMH (https://github.com/TuringLang/AdvancedMH.jl/blob/361a66905947799d7945818862673d07d5568b16/src/mh-core.jl#L78-L81) or DynamicPPL, and supporting an explicit `init_params` keyword argument with default value `nothing`.
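For reference, that convention could be sketched roughly like this (a hedged sketch in the spirit of the linked AdvancedMH code; `ToyModel` and `first_state` are illustrative names, not APIs of any of these packages):

```julia
using Random

struct ToyModel end

# Explicit `init_params` keyword with default `nothing`:
# `nothing` means "draw a fresh starting point", anything else is used as-is.
function first_state(rng::Random.AbstractRNG, ::ToyModel; init_params=nothing)
    return init_params === nothing ? randn(rng, 3) : init_params
end
```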
I can see the point. But is there a general performance impact from passing all `kwargs`?
Additionally, do you want me to modify the current `initial_sample` to

```julia
function initial_sample(rng, model::ESSModel; init_params=nothing)
    return init_params === nothing ? rand(rng, prior(model)) : init_params
end
```
I don't think so. But it seems easier to not have to overload any methods and instead just handle it in the implementation of `step` here, i.e., write explicitly something like

```julia
f = init_params === nothing ? ... : init_params
...
```
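Spelled out, handling it inside `step` could look roughly like this (a sketch only; `DummyModel`, the stand-in `prior`, and the simplified return value are assumptions for illustration, not the package's actual `step`):

```julia
using Random

struct DummyModel end

# Stand-in prior: drawing from it just samples an integer in 1:10.
prior(::DummyModel) = 1:10

# Handle `init_params` directly in a (simplified) step, instead of
# overloading `initial_sample`.
function step(rng::Random.AbstractRNG, model::DummyModel; init_params=nothing, kwargs...)
    f = init_params === nothing ? rand(rng, prior(model)) : init_params
    # ... the real sampler would continue from `f` here ...
    return f
end

step(Random.MersenneTwister(1), DummyModel(); init_params=7)  # returns 7
```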
> Additionally, do you want me to modify the current `initial_sample` with

No, I think it is simpler to just handle `init_params` in `step`, similar to e.g. AdvancedMH.
Great, thanks @theogf!
The idea behind this PR is to allow different behavior for `initial_sample` by passing the `kwargs` as well, typically for passing a defined starting point for the MCMC chain.