yebai closed this issue 1 year ago
@sethaxen any thoughts on how to design this API better?
cc @devmotion
But that interface is supported, no? After #301?
I think we still need to explicitly construct the `Hamiltonian` type or `LogDensityModel` type and pass it to the `sample` function. Some cosmetic changes are needed in the AbstractMCMC interface implementation to support an intuitive interface like the one in https://github.com/TuringLang/AdvancedMH.jl/pull/76.
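For context, the explicit construction referred to above looks roughly like the following (a hedged sketch based on the AdvancedHMC README example; the target density `ℓπ`, the dimension, and all sample counts are placeholders):

```julia
using AdvancedHMC, ForwardDiff

# Placeholder target: standard multivariate normal log density.
ℓπ(θ) = -sum(abs2, θ) / 2

D = 10
initial_θ = randn(D)

# The user must explicitly assemble the Hamiltonian ...
metric = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)

# ... the integrator, kernel, and adaptor ...
initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
integrator = Leapfrog(initial_ϵ)
kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

# ... and pass all of it to `sample`.
samples, stats = sample(hamiltonian, kernel, initial_θ, 1_000, adaptor, 500)
```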
> I think we still need to explicitly construct the Hamiltonian type and pass it to the sample function

Nah, you can just construct the `HMCSampler` and pass that in :)
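If that is the intended usage, it would presumably look something like the sketch below (hedged: the exact `HMCSampler` constructor arguments and the `n_adapts` keyword are assumptions on my part; the target type `Gauss` is a placeholder):

```julia
using AbstractMCMC, AdvancedHMC, LogDensityProblems

# Minimal placeholder target implementing the LogDensityProblems interface.
struct Gauss end
LogDensityProblems.logdensity(::Gauss, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(::Gauss) = 5
LogDensityProblems.capabilities(::Type{Gauss}) =
    LogDensityProblems.LogDensityOrder{0}()

model = AbstractMCMC.LogDensityModel(Gauss())

# Construct the sampler object once and hand it to `sample`;
# no explicit Hamiltonian construction is needed.
metric = DiagEuclideanMetric(5)
integrator = Leapfrog(0.1)
kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))
sampler = HMCSampler(kernel, metric, adaptor)

chain = sample(model, sampler, 1_000; n_adapts = 500)
```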
@torfjelde can you give an example of what you mean?
In my view, a user should be able to entirely configure the warm-up period and sampling period without being required to specify initializations for step size, metric, and initial point, while still allowing for the possibility that they specialize those initializations.
It should also be really easy to configure the warm-up phase(s); e.g., it should be possible to make initial step-size selection part of the warm-up phase, and to initialize the sampling with a metric of one type but then adapt a different type of metric.
Maybe some of these are already possible. If so, awesome! It would be handy to document such cases. If it becomes possible to use Turing models directly with the sampler without needing to define a new sampler as Turing does, that would be amazing!
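To make the wish concrete, a purely hypothetical configuration API (none of these keyword names exist today; this is an illustration of the desired ergonomics, not a proposal for exact spellings) might read:

```julia
# Hypothetical sketch: every warm-up decision is a keyword with a sensible
# default, so the common case needs no explicit initializations at all.
sampler = HMCSampler(;
    metric          = DiagEuclideanMetric,   # initialize with one metric type ...
    adapted_metric  = DenseEuclideanMetric,  # ... but adapt a different one
    find_stepsize   = true,                  # step-size search as part of warm-up
    n_adapts        = 1_000,
)
```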
> use Turing models directly with the sampler without needing to define a new sampler as Turing does, that would be amazing!

That's one nice goal for Turing (a by-product of our goal to thoroughly decouple model specification and inference). We are very close to that point!
I looked at DynamicHMC's interface: `mcmc_with_warmup` and `mcmc_keep_warmup`. What we need to include here is merely a well-documented user-facing API.
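For comparison, DynamicHMC's entry point is a single call that bundles warm-up and sampling (a sketch; the placeholder target wraps the problem with LogDensityProblemsAD because DynamicHMC requires gradients):

```julia
using DynamicHMC, LogDensityProblems, LogDensityProblemsAD, Random
import ForwardDiff

# Placeholder target implementing the LogDensityProblems interface.
struct Gauss end
LogDensityProblems.logdensity(::Gauss, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(::Gauss) = 5
LogDensityProblems.capabilities(::Type{Gauss}) =
    LogDensityProblems.LogDensityOrder{0}()

# DynamicHMC needs gradient information; wrap the problem with AD.
ℓ = ADgradient(:ForwardDiff, Gauss())

# Step-size and metric adaptation plus sampling, all in one call.
results = mcmc_with_warmup(Random.default_rng(), ℓ, 1_000)
```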
@devmotion, maybe we can make the `AbstractMCMC.sample` interface more intuitive so users have a nice way to specify adaptation, initialisation, algorithm choice, etc. It seems Julia people love `NamedTuple`s more than verbose arguments...
> maybe we can make the `AbstractMCMC.sample` interface more intuitive so users can have a nice way to specify adaptation, initialisation, algorithm choice etc. It seems Julia people love `NamedTuple` more than verbose arguments...
AbstractMCMC does not restrict how samplers want their options to be specified; it only adds some standard options such as `discard_initial`, `thinning`, and `progress`: https://turinglang.org/AbstractMCMC.jl/dev/api/#Common-keyword-arguments. It's completely up to downstream packages whether they want to support these, whether they want to add more options (often the case, I assume), and whether they want to nest options as named tuples. So I think specifying/changing options in AdvancedHMC does not (necessarily) require any changes in AbstractMCMC.
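These standard keyword arguments look like the following in use (an illustrative sketch; `model` and `sampler` stand for any AbstractMCMC-compatible pair, and whether each option is honoured is up to the downstream sampler package):

```julia
chain = sample(model, sampler, 1_000;
    discard_initial = 500,  # drop the first 500 transitions (burn-in)
    thinning = 10,          # keep every 10th transition
    progress = false)       # disable the progress meter
```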
Generally, I think there's not a single design choice that's used everywhere in the ecosystem. SciML (in most places) uses a large set of (non-nested) keyword arguments, DynamicHMC nests some options in `NamedTuple`s (I guess this might be useful mostly in cases where you have some hierarchy of options), and Optim supports options as `Optim.Options`.
We should consider supporting the following `AbstractMCMC.sample` interface (defined here). This allows users to pass a `LogDensityProblem` to the `sample` function instead of a `hamiltonian` object. We could default to the Gaussian kinetic energy and the canonical momentum-refreshment scheme. Moreover, we should define a few off-the-shelf `sampler` types rather than requiring the user to construct an `HMC` kernel from technically frightening components like `leapfrog` and trajectory-`termination` types. Such an intuitive `sample` interface can even be helpful for `Turing` models, allowing users to run MCMC without using out-of-date glue interfaces, (correctly) criticised here.

AdvancedMH: https://github.com/TuringLang/AdvancedMH.jl/pull/76
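Concretely, the proposed end state might look like this (a sketch; the `NUTS(0.8)` convenience constructor and the choice of defaults are assumptions, and the target type is a placeholder):

```julia
using AbstractMCMC, AdvancedHMC, LogDensityProblems

# The proposal: users pass a LogDensityProblem directly; defaults fill in
# the metric (Gaussian kinetic energy), integrator, and adaptation.
struct Gauss end
LogDensityProblems.logdensity(::Gauss, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(::Gauss) = 5
LogDensityProblems.capabilities(::Type{Gauss}) =
    LogDensityProblems.LogDensityOrder{0}()

chain = sample(AbstractMCMC.LogDensityModel(Gauss()), NUTS(0.8), 1_000)
```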