flatironinstitute/bayes-kit

Bayesian inference and posterior analysis for Python

partial momentum refresh MALA #23

Open bob-carpenter opened 1 year ago

bob-carpenter commented 1 year ago

Add an implementation of partial momentum refresh (aka underdamped) MALA. The idea is to only partially refresh the momentum each iteration. This method was introduced in

I found Radford Neal's description to be the clearest (page 5, displayed equations in the middle), where the momentum (which I'm calling rho to match our other notation) is updated for mixture rate alpha in [0, 1] as

z ~ normal(0, I)
rho[n] = alpha * rho[n-1] + sqrt(1 - alpha^2) * z

With alpha = 0, we get back standard MALA and results should match. As alpha increases, the fraction of momentum refreshed decreases, inducing more persistence of momentum. With alpha = 1, there is no refresh at all, and the result is a single, pure Hamiltonian trajectory continued across iterations (never leaving the same level set of the Hamiltonian except for arithmetic/discretization error).
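
Here's a rough NumPy sketch of what one iteration could look like: a partial momentum refresh, one leapfrog step (the MALA proposal), and a Metropolis accept/reject on the joint density. The function name and signature are just for illustration, not a proposed bayes-kit API, and negating the momentum on rejection is the usual Horowitz-style convention assumed here rather than something spelled out above.

```python
import numpy as np


def partial_refresh_mala_step(theta, rho, log_p, grad_log_p, step_size, alpha, rng):
    """One sketch iteration of partial-momentum-refresh (underdamped) MALA."""
    # Partial momentum refresh: alpha = 0 is a full refresh (standard MALA),
    # alpha = 1 keeps the previous momentum (pure Hamiltonian trajectory).
    z = rng.standard_normal(theta.shape)
    rho = alpha * rho + np.sqrt(1.0 - alpha**2) * z

    # One leapfrog step of Hamiltonian dynamics with a unit metric.
    rho_half = rho + 0.5 * step_size * grad_log_p(theta)
    theta_prop = theta + step_size * rho_half
    rho_prop = rho_half + 0.5 * step_size * grad_log_p(theta_prop)

    # Metropolis accept/reject on the joint log density of (theta, rho).
    log_joint = log_p(theta) - 0.5 * rho @ rho
    log_joint_prop = log_p(theta_prop) - 0.5 * rho_prop @ rho_prop
    if np.log(rng.uniform()) < log_joint_prop - log_joint:
        return theta_prop, rho_prop
    # On rejection, keep theta and flip the momentum (assumed Horowitz-style
    # convention so partial refresh leaves the target distribution invariant).
    return theta, -rho


# Toy usage on a standard normal target.
rng = np.random.default_rng(0)
log_p = lambda x: -0.5 * x @ x
grad_log_p = lambda x: -x
theta, rho = np.zeros(2), rng.standard_normal(2)
for _ in range(1_000):
    theta, rho = partial_refresh_mala_step(
        theta, rho, log_p, grad_log_p, step_size=0.5, alpha=0.9, rng=rng
    )
```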