Open aarchiba opened 2 years ago
The NUTS sampler: https://arxiv.org/abs/1111.4246
Parallel tempering to get more parallelism: http://auai.org/uai2017/proceedings/papers/289.pdf
@abhisrkckl: this may be relevant to you
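For reference, the core move in parallel tempering is cheap and embarrassingly parallel between swap rounds: each chain `i` samples the posterior with its likelihood tempered by `betas[i]`, and adjacent chains periodically exchange states. This is a generic sketch of one swap round (not the specific scheme from the linked paper); `lnlike` stands in for whatever log-likelihood interface PINT ends up exposing.

```python
import numpy as np

def swap_adjacent(betas, states, lnlike, rng=np.random.default_rng()):
    """One round of replica-exchange swaps for parallel tempering.

    Chain i targets prior(x) * like(x)**betas[i]; adjacent chains swap
    states with the standard Metropolis acceptance probability, which
    preserves detailed balance for the joint tempered distribution.
    """
    lls = [lnlike(s) for s in states]
    for i in range(len(betas) - 1):
        # Acceptance log-probability for exchanging chains i and i+1.
        log_accept = (betas[i] - betas[i + 1]) * (lls[i + 1] - lls[i])
        if np.log(rng.uniform()) < log_accept:
            states[i], states[i + 1] = states[i + 1], states[i]
            lls[i], lls[i + 1] = lls[i + 1], lls[i]
    return states
```

Between swap rounds every chain runs independently, which is where the extra parallelism comes from.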
I think it is better to have an API that provides `lnlikelihood`, `lnlikelihood_gradient`, `lnprior`, and `prior_transform` than to have sampling methods baked into PINT itself. This will let users pick the sampler of their choice rather than being tied to whatever PINT ships.
I am working on an interface like this. I have not implemented `lnlikelihood_gradient` yet, but it should be straightforward to do using the design matrix.
Related discussion in issue #1310
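To make the proposal concrete, here is a minimal sketch of what such an interface could look like. All names here are illustrative, not PINT's actual API, and it assumes a Gaussian likelihood linearized about the current model, so the design matrix `M` gives the gradient directly.

```python
import numpy as np

class TimingLikelihood:
    """Hypothetical sampler-agnostic interface (illustrative names only).

    Assumes Gaussian white noise and a model linearized about reference
    parameters, so residuals are r(dp) = r0 - M @ dp and the design
    matrix M yields the likelihood gradient in closed form.
    """

    def __init__(self, residuals, sigma, design_matrix, prior_lo, prior_hi):
        self.r0 = np.asarray(residuals)     # residuals at the reference parameters
        self.sigma = np.asarray(sigma)      # TOA uncertainties
        self.M = np.asarray(design_matrix)  # shape (ntoa, nparam)
        self.lo = np.asarray(prior_lo)
        self.hi = np.asarray(prior_hi)

    def lnlikelihood(self, dp):
        r = self.r0 - self.M @ dp
        return -0.5 * np.sum((r / self.sigma) ** 2)

    def lnlikelihood_gradient(self, dp):
        # For a Gaussian likelihood the gradient is M^T N^{-1} r.
        r = self.r0 - self.M @ dp
        return self.M.T @ (r / self.sigma ** 2)

    def lnprior(self, dp):
        # Flat box prior as a placeholder.
        return 0.0 if np.all((self.lo <= dp) & (dp <= self.hi)) else -np.inf

    def prior_transform(self, u):
        # Map the unit cube to the prior box (for nested samplers).
        return self.lo + u * (self.hi - self.lo)
```

With these four methods, gradient-based samplers use `lnlikelihood_gradient`, ensemble samplers use `lnlikelihood` + `lnprior`, and nested samplers use `prior_transform`, all against the same object.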
When derivatives of the objective function are available, MCMC methods that use them, for example Hamiltonian Monte Carlo, can be very much faster than methods that don't. PINT has gone to great lengths to make derivatives available, so we should take advantage of them. PyMC3 includes the supposedly tuning-free "NUTS" sampler, so it could probably be used for this, although some plumbing work would likely be needed. This could also work for `event_optimize`, where faster runtimes would be extremely welcome. I'm not sure how well it parallelizes.
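To illustrate why the gradient matters: a single HMC update uses `lnlikelihood_gradient` to simulate Hamiltonian dynamics, letting the chain take long, informed jumps instead of a random walk. This is a bare-bones leapfrog sketch with a fixed step size (NUTS is essentially this plus automatic tuning of `step` and the trajectory length); `lnpost`/`grad_lnpost` stand in for whatever log-posterior interface gets exposed.

```python
import numpy as np

def hmc_step(q, lnpost, grad_lnpost, step=0.1, nleap=20, rng=np.random.default_rng()):
    """One Hamiltonian Monte Carlo update via leapfrog integration."""
    p = rng.normal(size=q.shape)  # draw fresh momenta
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog: half momentum step, alternating full steps, half momentum step.
    p_new = p_new + 0.5 * step * grad_lnpost(q_new)
    for _ in range(nleap - 1):
        q_new = q_new + step * p_new
        p_new = p_new + step * grad_lnpost(q_new)
    q_new = q_new + step * p_new
    p_new = p_new + 0.5 * step * grad_lnpost(q_new)
    # Metropolis accept/reject on the total "energy" corrects integration error.
    h_old = -lnpost(q) + 0.5 * p @ p
    h_new = -lnpost(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return q_new, True
    return q, False
```

Each update costs `nleap` gradient evaluations, but the resulting samples are far less correlated than gradient-free proposals, which is where the speedup comes from in high dimensions.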