dustinvtran closed this issue 8 years ago
Hey, great work with Edward! Do you have any insight into how difficult it would be to add support for conjugate-exponential family models (or sub-graphs) so that exact gradients could be utilized (as in variational message passing)? Would it be possible or straightforward with the current machinery? I suppose it would require that the log-density functions be written as functions of the sufficient statistics instead of the variables directly, and that seems like a very big change to the current code base. Anyway, I might be interested in taking a look at this if I find the time, but any pointers or comments would be helpful. I can also open another issue if you consider this off-topic. Cheers!
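To make the idea concrete, here's a minimal sketch (illustrative only, not Edward code) of a Bernoulli log-density written in exponential-family form, so the density touches the data only through its sufficient statistics:

```python
import numpy as np

# Exponential-family form: log p(x | eta) = eta . t(x) - A(eta) + log h(x).
# Gradients and conjugate updates then depend on x only through t(x).
# All names here are illustrative, not part of any library's API.

def bernoulli_suff_stats(x):
    # Sufficient statistic for the Bernoulli: t(x) = x.
    return np.asarray(x, dtype=float)

def bernoulli_log_prob(x, eta):
    # Natural parameter eta = log(p / (1 - p));
    # log-normalizer A(eta) = log(1 + e^eta).
    t = bernoulli_suff_stats(x)
    log_norm = np.log1p(np.exp(eta))
    return eta * t - log_norm

# Check against the usual parameterization p = sigmoid(eta).
p = 0.3
eta = np.log(p / (1 - p))
assert np.isclose(bernoulli_log_prob(1, eta), np.log(p))
assert np.isclose(bernoulli_log_prob(0, eta), np.log(1 - p))
```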
Hi @jluttine: The main difficulty is in the modeling language. I'm not familiar with a language that would let us take advantage of that structure. However, once one exists, this is straightforward with the current machinery, in a way similar to what you mention. The exact details depend on how full conditional densities are encoded in the language.
Hi Dustin,
I read your blog today. I was struck by your amazing work on variational Bayesian inference.
I am also into Bayesian research, but more into its applications for automating things. I have implemented an algorithm for learning the parameters of mixed Bayesian networks, but that was in Matlab.
I want to do that in Python. Does any of your libraries support Bayesian parameter learning for mixed networks from data? Or is it possible to do that somehow using a variational approximation?
Please help.
Thanks and regards Aakanksha
Hi Aakanksha,
The library supports inference for arbitrary joint distributions (that is, any model which can be written as a tractable joint density of data and latent variables). This includes Bayesian networks, although I haven't found a good Bayesian network/graphical modeling language in Python that would let us take advantage of the Markov blanket structure. Once there is a way to obtain the Markov blanket for any individual latent variable, we can do very fast variational inference on graphical models.
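For illustration, here is a minimal sketch (plain Python, not Edward's API) of how a Markov blanket can be read off a DAG once the structure is available: the blanket of a node is its parents, its children, and its children's other parents.

```python
# Hypothetical sketch: a DAG represented as a dict mapping each node to
# its list of parents. Not any library's representation.

def markov_blanket(parents_of, node):
    parents = set(parents_of.get(node, []))
    children = {n for n, ps in parents_of.items() if node in ps}
    coparents = set()
    for c in children:
        coparents |= set(parents_of[c])  # children's other parents
    return (parents | children | coparents) - {node}

# Classic sprinkler network: Cloudy -> Rain, Cloudy -> Sprinkler,
# Rain -> WetGrass, Sprinkler -> WetGrass.
dag = {"Cloudy": [], "Rain": ["Cloudy"], "Sprinkler": ["Cloudy"],
       "WetGrass": ["Rain", "Sprinkler"]}
print(sorted(markov_blanket(dag, "Rain")))
# -> ['Cloudy', 'Sprinkler', 'WetGrass']
```

Given such a blanket, a coordinate-wise variational update for one latent variable only needs to look at the nodes in its blanket, which is what makes the inference fast.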
For more information see https://github.com/blei-lab/edward/issues/27. If you have the bandwidth to look in that direction, it would be huge for the graphical modeling community. Hope that helps!
Dustin
https://github.com/pgmpy/pgmpy Have you seen this? "Python Library for Probabilistic Graphical Models"
@dustinvtran libpgm, developed by students under Daphne Koller, has a module to learn the structure of a Bayesian network. The Markov blanket can easily be found from the structure of the network.
@datnamer Yeah, I have seen it. I faced some issues related to prior specification and belief propagation which have not been resolved yet. I'm quite doubtful about the structure learning available in it as well.
(for my own bookkeeping)
Hakaru (https://github.com/hakaru-dev/hakaru) has incredible symbolic computation where given a normal-normal model, it will compile and return the normal model marginalizing over the latent variable. This is a pretty cool way to deal with conjugacy. Moreover, it is a first step to thinking about compile-time optimization beyond just deterministic things (e.g., collapsing reducible operations) to operations with randomness.
We should look into how they deal with conjugacy.
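As a quick sanity check of the collapse described above (a Monte Carlo illustration, not Hakaru output): if z ~ N(mu0, s0^2) and x | z ~ N(z, s^2), marginalizing over z gives x ~ N(mu0, s0^2 + s^2), which is exactly what the symbolic compilation returns for a normal-normal model.

```python
import math
import random
import statistics

# Illustrative check of the normal-normal marginalization; all names and
# parameter values here are made up for the example.
random.seed(0)
mu0, s0, s = 1.0, 2.0, 0.5

xs = []
for _ in range(200_000):
    z = random.gauss(mu0, s0)      # latent variable
    xs.append(random.gauss(z, s))  # observation given the latent

m, sd = statistics.mean(xs), statistics.pstdev(xs)
print(m)   # close to mu0 = 1.0
print(sd)  # close to sqrt(s0**2 + s**2) ~ 2.06
```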
@dustinvtran you might find some of the following research related to hakaru useful:
ftp://ftp.cs.indiana.edu/pub/techreports/TR719.pdf
Nice. Thanks for the references @suhailshergill. They are indeed very useful! (I was originally looking through the documentation but didn't get much out of it; I'll browse these related papers instead.)
np :)
btw, @dustinvtran if/when you're in toronto give me a heads up. we would love to have you (and others) present in our probabilistic programming meetup here
@dustinvtran PS github doesn't show ftp links, but i've edited my comment for the computer algebra paper
Yeah, will definitely give you a heads-up! Also letting others in the group know.
Hey, any updates on this?
You can see preliminary work on the branch feature/metagraph. The pull request #192 makes progress from Edward's current API to support various things in the language.
There is an enormous number of challenges in the language design, which I've been furrowing my brow over these past two months. Here are some concepts floating around, to give a sense of what I'm working with:
Things will be more concrete as we understand some of these ideas better.
It'd be nice to have a language to specify directed acyclic graphs, so we can take advantage of Rao-Blackwellization/Markov blankets. Also it'd be nice to take advantage of the case when all full conditional distributions are an exponential family.
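To sketch what exponential-family full conditionals buy (a hypothetical Beta-Bernoulli example, not Edward code): when the prior and likelihood are conjugate, the exact conditional is available in closed form, so the update is just adding sufficient statistics to the prior's parameters rather than taking gradient steps.

```python
# Hedged illustration of a conjugate update; the function name and
# parameterization are made up for this example.

def beta_bernoulli_posterior(a, b, data):
    # Beta(a, b) prior, Bernoulli observations. The conjugate update adds
    # the sufficient statistics (#successes, #failures) to (a, b).
    heads = sum(data)
    return a + heads, b + len(data) - heads

print(beta_bernoulli_posterior(1.0, 1.0, [1, 1, 0, 1]))  # -> (4.0, 2.0)
```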