Closed — reubenharry closed this 10 months ago
Attention: 1 line in your changes is missing coverage. Please review.
Comparison is base (f49945d) 99.16% compared to head (70f1dd5) 99.16%.
| Files | Patch % | Lines |
|---|---|---|
| blackjax/mcmc/mclmc.py | 97.67% | 1 Missing :warning: |
After a bit of a mammoth effort, the PR is ready for review.
One possible point of objection is that there are still some poorly named variables, now confined to certain functions in the adaptation code. I have not renamed these because I don't fully understand what all of them do, so some reading and/or talking to Jakob is in order.
But there is now a test of both the tuning and the kernel that uses pytrees, showing that the code is pytree generic. Moreover, the implementation reproduces the original MCLMC implementation exactly (with appropriate choices of random seeds). So, apart from a few details (I removed the preconditioning for now, for simplicity), it's hopefully looking good.
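To illustrate what "pytree generic" means here: the kernel accepts arbitrarily nested containers of values as the position, not just flat arrays. Below is a minimal toy `tree_map` in plain Python to show the idea; BlackJax itself uses `jax.tree_util`, and the names here are illustrative only, not the PR's API.

```python
def tree_map(f, tree):
    """Apply f to every leaf of a nested dict/list/tuple 'pytree'."""
    if isinstance(tree, dict):
        return {k: tree_map(f, v) for k, v in tree.items()}
    if isinstance(tree, (list, tuple)):
        return type(tree)(tree_map(f, v) for v in tree)
    return f(tree)  # leaf value

# A position expressed as a nested container, as in the pytree test.
position = {"loc": 1.0, "scale": [2.0, 3.0]}
stepped = tree_map(lambda x: x + 0.5, position)
# → {"loc": 1.5, "scale": [2.5, 3.5]}
```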
Todos:
- `main` commit
- `pre-commit` is installed and configured on your machine, and you ran it before opening the PR
- `explore.py` script

As per #530, this PR adds MCLMC (Microcanonical Langevin Monte Carlo) as a sampling method to BlackJax. It is an adaptation of the original repo: https://github.com/JakobRobnik/MicroCanonicalHMC/tree/master.
In particular, this is an implementation of the single-chain, partial momentum-update algorithm (as opposed to full momentum updates every n steps). Once this PR is ready, it will also implement the MCLMC tuning algorithm.
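The partial momentum update mentioned above can be sketched as follows. MCLMC keeps the momentum on the unit sphere (the microcanonical constraint); a partial refresh mixes the current direction with isotropic Gaussian noise and renormalizes. This is a NumPy sketch of the idea only — the function name and the mixing parameter `nu` are illustrative, not the PR's actual API.

```python
import numpy as np

def partial_momentum_refresh(u, nu, rng):
    """Partially refresh a unit-norm momentum vector.

    Mixes the current direction `u` with Gaussian noise scaled by `nu`
    and projects back onto the unit sphere, so the momentum norm is
    preserved. With small `nu`, successive momenta stay correlated,
    which is the "partial update" regime, as opposed to drawing a
    fresh direction every n steps.
    """
    z = nu * rng.normal(size=u.shape)
    v = u + z
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
u = np.array([1.0, 0.0, 0.0])
u_new = partial_momentum_refresh(u, nu=0.1, rng=rng)
# The refreshed momentum remains on the unit sphere.
assert np.isclose(np.linalg.norm(u_new), 1.0)
```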