-
@stanniszhou gave a talk by this title at PROBPROG 2020. Could AdvancedHMC be adapted to do this? HMC for discrete variables could be a game changer :)
https://stanniszhou.github.io/papers/mixed_hm…
-
### Presentation of the new sampler
https://arxiv.org/abs/2110.00610
### How does it compare to other algorithms in blackjax?
The authors claim up to 5x more ESS per gradient evaluation.
### Where d…
-
Suggestion: add a 3rd example to [tfp.mcmc.HamiltonianMonteCarlo](https://www.tensorflow.org/probability/api_docs/python/tfp/mcmc/HamiltonianMonteCarlo) showing how to infer the posterior parameters o…
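A minimal sketch of what such an added example might look like (the model here, a Normal mean with a Normal prior, and all of the numbers are illustrative assumptions, not the example actually being requested):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Toy data: noisy observations of an unknown mean (illustrative only).
observed = tf.constant([1.2, 0.8, 1.5, 0.9, 1.1])

def target_log_prob_fn(mu):
    prior = tfd.Normal(loc=0., scale=10.).log_prob(mu)
    likelihood = tf.reduce_sum(tfd.Normal(loc=mu, scale=1.).log_prob(observed))
    return prior + likelihood

kernel = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob_fn,
    step_size=0.1,
    num_leapfrog_steps=5)

samples, is_accepted = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.constant(0.),
    kernel=kernel,
    trace_fn=lambda _, pkr: pkr.is_accepted)

posterior_mean = tf.reduce_mean(samples)
```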
-
# Context
Documenting field-level explicit likelihood inference from a differentiable cosmological model.
In [code](https://github.com/hsimonfroy/montecosmo/blob/a7346788b5555b2f6b14bff3e9dc2f4c9…
-
My entries in particular are all over the shop with this!
What style should we use? Harvard? For clarity, this means entries of the form below:
Girolami, M. and Calderhead, B., 2011. Riemann manifold langevi…
-
I run Hamiltonian Monte Carlo on 4 copies of my model for 10^5 steps on a GPU.
Each copy of the model contains about 1000 parameters. The log-likelihood function contains `tf.scan`. The main (cpu) m…
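For reference, a minimal sketch of this kind of setup (the toy model, shapes, and step sizes are illustrative assumptions, not the actual code): 4 chains run as a batch dimension, with a log-likelihood that goes through `tf.scan`.

```python
import tensorflow as tf
import tensorflow_probability as tfp

num_chains, num_params = 4, 1000

def target_log_prob_fn(theta):
    # theta has shape [num_chains, num_params]; the sequential part of the
    # model is stood in for by a toy cumulative sum built with tf.scan.
    running = tf.scan(lambda acc, x: acc + x, tf.transpose(theta))  # [num_params, num_chains]
    return -0.5 * tf.reduce_sum(tf.square(running), axis=0)         # one log-prob per chain

kernel = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob_fn,
    step_size=0.01,
    num_leapfrog_steps=10)

# 10^5 transitions, with the 4 chains running in parallel as a batch.
samples = tfp.mcmc.sample_chain(
    num_results=10**5,
    current_state=tf.zeros([num_chains, num_params]),
    kernel=kernel,
    trace_fn=None)
```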
-
-
Hamiltonian Monte Carlo methods
---------------
1. Vanilla HMC (see the sketch after this list)
2. Riemannian Manifold HMC
3. Lagrangian HMC
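For reference, a minimal sketch of a single transition of vanilla HMC (item 1 above): leapfrog integration followed by a Metropolis correction. The interface and the constants are illustrative assumptions.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, rng, step_size=0.1, n_leapfrog=10):
    """One vanilla HMC transition: leapfrog integration plus a Metropolis correction."""
    p = rng.standard_normal(q.shape)              # resample the momentum
    current_h = -log_prob(q) + 0.5 * p @ p        # Hamiltonian at the starting point

    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_prob(q_new)   # half step for the momentum
    for _ in range(n_leapfrog - 1):
        q_new += step_size * p_new                    # full step for the position
        p_new += step_size * grad_log_prob(q_new)     # full step for the momentum
    q_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(q_new)   # final half step

    proposed_h = -log_prob(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < current_h - proposed_h:  # Metropolis accept/reject
        return q_new
    return q
```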
-
* [Link](https://arxiv.org/abs/1906.10652)
* Title: Monte Carlo Gradient Estimation in Machine Learning
* Keywords (optional):
* Authors (optional):
* Reason (optional):
* Summary (option…
-
#### Summary:
The number of log prob gradient evaluations per sample is one greater than the reported `n_leapfrog` for that sample.
This does not need to be so; the gradient for the starting point w…
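A minimal sketch of the caching idea (the surrounding sampler interface is an assumption): carry the gradient computed at the end of the previous trajectory into the next one, so each transition costs exactly `n_leapfrog` fresh gradient evaluations rather than `n_leapfrog` + 1.

```python
def leapfrog_with_cached_grad(q, p, grad_q, grad_log_prob, step_size, n_leapfrog):
    """grad_q is the gradient at q, cached from the previous transition.

    Returns the new state together with its gradient so the caller can cache it:
    exactly n_leapfrog fresh gradient evaluations per call instead of n_leapfrog + 1.
    """
    p = p + 0.5 * step_size * grad_q       # reuses the cached gradient: no new evaluation
    for _ in range(n_leapfrog - 1):
        q = q + step_size * p
        grad_q = grad_log_prob(q)          # one evaluation per inner step
        p = p + step_size * grad_q
    q = q + step_size * p
    grad_q = grad_log_prob(q)              # final evaluation; cached for the next transition
    p = p + 0.5 * step_size * grad_q
    return q, p, grad_q
```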