-
Hi Georges
Thanks for this very nice implementation.
I'm hoping to use this alongside a continuous FM model, but would ideally need an ODE formulation for the 'log-likelihood' - at fi…
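For context (not from this thread): the ODE formulation of the log-likelihood for a continuous flow usually means the instantaneous change-of-variables result from the neural-ODE literature, where the log-density is obtained by integrating the divergence of the vector field along the trajectory:

$$
\log p_1\bigl(x(1)\bigr) = \log p_0\bigl(x(0)\bigr) - \int_0^1 \operatorname{tr}\!\left(\frac{\partial f_\theta}{\partial x}\bigl(x(t), t\bigr)\right) \mathrm{d}t,
\qquad \frac{\mathrm{d}x}{\mathrm{d}t} = f_\theta\bigl(x(t), t\bigr).
$$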
-
My understanding: the picture above has a network $I_\phi$, which the author intends to play the role of one-shot inference. The network is trained with a variational method, and the lower bound is …
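For reference, the generic variational lower bound (ELBO) that amortized inference networks are usually trained with is shown below; the bound actually used is elided above, so this is only the standard form, not necessarily the paper's:

$$
\log p(x) \ge \mathcal{L}(\phi) = \mathbb{E}_{z \sim q_\phi(z \mid x)}\bigl[\log p(x, z) - \log q_\phi(z \mid x)\bigr].
$$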
-
Most of the more popular probabilistic programming languages have implementations of variational inference (VI). As such, its absence from monad-bayes is something of an obstacle to real-world use.
…
-
This issue is to discuss the design of the variational inference methods.
So far in AugmentedGaussianProcesses.jl things are done this way (see the sketch after the list):
- There are two functions `∇E_μ` and `∇E_Σ` which r…
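Not from the package itself, but a minimal sketch of what such per-likelihood gradient hooks could look like, assuming a Gaussian likelihood with noise variance σ² (the actual signatures of `∇E_μ` and `∇E_Σ` in AugmentedGaussianProcesses.jl may differ):

```julia
# Hedged sketch: gradients of the expected log-likelihood E_q[log p(y | f)]
# under q(f) = N(μ, Σ), for a Gaussian likelihood y ~ N(f, σ²).
# Names and signatures are illustrative only.
struct GaussianLikelihood
    σ²::Float64
end

# ∂/∂μ E_q[log p(y | f)] = (y - μ) / σ²
∇E_μ(l::GaussianLikelihood, y::Vector, μ::Vector) = (y .- μ) ./ l.σ²

# ∂/∂Σᵢᵢ E_q[log p(y | f)] = -1 / (2σ²), constant in Σ for this likelihood
∇E_Σ(l::GaussianLikelihood, y::Vector, μ::Vector) = fill(-0.5 / l.σ², length(y))
```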
-
We allow `model.fit(method="advi")`, but we have largely ignored what happens next for the user. We should improve this situation. We should at least improve two aspects: what object we return when us…
-
@torfjelde the BNN tutorial is failing because
```julia
update(q, (μ, exp.(ω)))
```
no longer seems to work, because `update` no longer appears to be exported. I tried calling
```…
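For what it's worth, one plausible workaround when a symbol stops being exported is to call it fully qualified; assuming `update` still lives in AdvancedVI (not verified here), that would be:

```julia
# Hedged workaround: qualify the un-exported function with its module.
# Assumes `update` is still defined in AdvancedVI; adjust if it has moved.
AdvancedVI.update(q, (μ, exp.(ω)))
```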
-
References:
- Bishop, Spiegelhalter & Winn, "VIBES: A Variational Inference Engine for Bayesian Networks": http://papers.nips.cc/paper/2172-vibes-a-variational-inference-engine-for-bayesian-networks.pdf
- Winn & Bishop, "Variational Message Passing": http://www.jmlr.org/papers/volume6/winn05a/winn05a.pdf
-
To support VI we need to decide which different aspects we should implement.
## Data structure
We probably need a special data structure for VI results.
It could contain (a strawman sketch follows the list):
- posterior mean
- p…
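Purely as a strawman, such a result type might look like the sketch below; every field name is hypothetical, not a proposal from this thread:

```julia
# Hedged strawman for a VI result container. All names are hypothetical.
struct VIResult{T<:Real}
    μ::Vector{T}          # posterior mean of the variational approximation
    Σ::Matrix{T}          # posterior covariance (or a factor of it)
    elbo_trace::Vector{T} # ELBO value per optimisation step, for diagnostics
    converged::Bool       # whether the optimiser met its stopping criterion
end
```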
-
The current tutorials mostly cover MCMC. Could we get one for variational inference? The Edward tutorial on Supervised Learning shows how to run inference using the Kullback-Leibler divergence. It …
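If the target here is Turing.jl, a hedged sketch of what such a tutorial's core might boil down to, assuming the `ADVI`/`vi` interface (check the current docs for exact signatures):

```julia
using Turing

# Toy supervised model: Bayesian linear regression on (x, y) pairs.
@model function linreg(x, y)
    w ~ Normal(0, 1)
    b ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1); lower=0)
    for i in eachindex(y)
        y[i] ~ Normal(w * x[i] + b, σ)
    end
end

x = randn(50)
y = 2 .* x .+ 0.5 .+ 0.1 .* randn(50)

# ADVI(samples_per_step, max_iters); vi(...) returns a fitted variational posterior.
q = vi(linreg(x, y), ADVI(10, 1000))
posterior_draws = rand(q, 1_000)  # draw samples from the approximation
```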