blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

a VIBES-like modeling language #27

Closed dustinvtran closed 8 years ago

dustinvtran commented 8 years ago

It'd be nice to have a language for specifying directed acyclic graphs, so we can take advantage of Rao-Blackwellization/Markov blankets. It would also be nice to exploit the case where all full conditional distributions are in the exponential family.
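
For concreteness, here is a minimal sketch (plain Python, not an Edward API; the model and node names are made up) of how a DAG specification could expose Markov blankets structurally, which is what Rao-Blackwellized estimators need:

```python
# A DAG represented as a dict mapping each node to its list of parents.
# The Markov blanket of a node is its parents, its children, and the
# children's other parents (co-parents).

def markov_blanket(parents, node):
    children = [v for v, ps in parents.items() if node in ps]
    coparents = {p for c in children for p in parents[c]} - {node}
    return set(parents[node]) | set(children) | coparents

# Hypothetical model: mu -> x <- tau, x -> y
dag = {"mu": [], "tau": [], "x": ["mu", "tau"], "y": ["x"]}
print(markov_blanket(dag, "x"))   # {'mu', 'tau', 'y'}
print(markov_blanket(dag, "mu"))  # {'x', 'tau'}
```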

jluttine commented 8 years ago

Hey, great work with Edward! Do you have any insight into how difficult it would be to add support for conjugate-exponential-family models (or sub-graphs) so that exact gradients could be used (as in variational message passing)? Would that be possible or straightforward with the current machinery? I suppose it would require that the log-density functions be written as functions of the sufficient statistics instead of the variables directly, and that seems like a very big change to the current code base. Anyway, I might be interested in taking a look at this if I find the time, but any pointers or comments would be helpful. I can also open another issue if you consider this off-topic. Cheers!
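
To make "written as functions of the sufficient statistics" concrete, here is a small numerical sketch (plain numpy, not Edward code) of the exponential-family form `log p(x | eta) = <eta, T(x)> - A(eta) + log h(x)` for a univariate Gaussian:

```python
import numpy as np

def gaussian_natural_params(mu, sigma2):
    # eta = (mu / sigma^2, -1 / (2 sigma^2))
    return np.array([mu / sigma2, -0.5 / sigma2])

def sufficient_stats(x):
    # T(x) = (x, x^2)
    return np.array([x, x ** 2])

def log_partition(eta):
    # A(eta) = -eta_1^2 / (4 eta_2) - (1/2) log(-2 eta_2)
    return -eta[0] ** 2 / (4 * eta[1]) - 0.5 * np.log(-2 * eta[1])

def log_density(x, mu, sigma2):
    eta = gaussian_natural_params(mu, sigma2)
    # log h(x) = -(1/2) log(2 pi) for the Gaussian
    return eta @ sufficient_stats(x) - log_partition(eta) - 0.5 * np.log(2 * np.pi)

# Agrees with the usual parameterization:
assert np.isclose(log_density(1.3, 0.5, 2.0),
                  -0.5 * np.log(2 * np.pi * 2.0) - (1.3 - 0.5) ** 2 / (2 * 2.0))
```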

dustinvtran commented 8 years ago

Hi @jluttine: The main difficulty is in the modeling language. I'm not familiar with a language that would let us take advantage of that structure. However, once one exists, it is straightforward with the current machinery, in a similar way to what you mention. The exact details depend on how full conditional densities are encoded in the language.

dustinvtran commented 8 years ago

Hi Dustin,

I read your blog today. I was struck by your amazing work on variational Bayesian inference.

I am also into Bayesian research, but more into its applications for automating things. I implemented an algorithm for learning the parameters of mixed Bayesian networks, but that was in Matlab.

I want to do that in Python. Do any of your libraries support Bayesian parameter learning for mixed networks from data? Or is it possible to do that using a variational approximation somehow?

Please help.

Thanks and regards Aakanksha

Hi Aakanksha,

The library supports inference for arbitrary joint distributions (that is, any model which can be written as a tractable joint density of data and latent variables). This includes Bayesian networks, although I haven't found a good Bayesian network/graphical modeling language in Python that would let us take advantage of the Markov blanket structure. Once there is a way to obtain the Markov blanket for any individual latent variable, we can do very fast variational inference on graphical models.

For more information see https://github.com/blei-lab/edward/issues/27. If you have bandwidth to look into that direction, then it would be huge for the graphical modeling community. Hope that helps!

Dustin

datnamer commented 8 years ago

https://github.com/pgmpy/pgmpy Have you seen this? "Python Library for Probabilistic Graphical Models"
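
For a sense of what pgmpy offers, here is a toy sketch assuming its discrete API (`BayesianModel`, `TabularCPD`, `VariableElimination`); the network and probabilities are made up, and the exact API may differ across versions:

```python
from pgmpy.models import BayesianModel
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two-node network: Rain -> WetGrass
model = BayesianModel([("Rain", "WetGrass")])
cpd_rain = TabularCPD("Rain", 2, [[0.8], [0.2]])
cpd_grass = TabularCPD("WetGrass", 2,
                       [[0.9, 0.1],   # P(WetGrass=0 | Rain)
                        [0.1, 0.9]],  # P(WetGrass=1 | Rain)
                       evidence=["Rain"], evidence_card=[2])
model.add_cpds(cpd_rain, cpd_grass)
assert model.check_model()

# Exact posterior P(Rain | WetGrass=1) by variable elimination.
posterior = VariableElimination(model).query(
    variables=["Rain"], evidence={"WetGrass": 1})
print(posterior)
```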

Aakanksha-B commented 8 years ago

@dustinvtran libpgm, developed by students under Daphne Koller, has a module to learn the structure of a Bayesian network. The Markov blanket can easily be found from the structure of the network.

@datnamer Yeah, I have seen it. I faced some issues related to prior specification and belief propagation which have not been resolved yet. I'm quite doubtful about the structure learning available in it as well.

dustinvtran commented 8 years ago

(for my own bookkeeping)

Hakaru (https://github.com/hakaru-dev/hakaru) has incredible symbolic computation: given a normal-normal model, it will compile it down to the marginal normal model, integrating out the latent variable. This is a pretty cool way to deal with conjugacy. Moreover, it is a first step toward compile-time optimization beyond just deterministic rewrites (e.g., collapsing reducible operations) to rewrites of operations with randomness.

We should look into how they deal with conjugacy.
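
For reference, the collapse Hakaru performs symbolically on a normal-normal model is the standard conjugacy identity (written out by hand here, not Hakaru output):

```latex
% Model:     x ~ N(mu, sigma^2),   y | x ~ N(x, tau^2)
% Collapsed: integrating out the latent x gives another normal.
\begin{align*}
p(y) &= \int \mathcal{N}(y \mid x, \tau^2)\, \mathcal{N}(x \mid \mu, \sigma^2)\, dx \\
     &= \mathcal{N}(y \mid \mu, \sigma^2 + \tau^2).
\end{align*}
```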

suhailshergill commented 8 years ago

@dustinvtran you might find some of the following research related to hakaru useful:

dustinvtran commented 8 years ago

Nice. Thanks for the references @suhailshergill. They are indeed very useful! (I was originally looking through the documentation but didn't get much out of it; I'll browse these related papers instead.)

suhailshergill commented 8 years ago

np :)

suhailshergill commented 8 years ago

btw, @dustinvtran if/when you're in toronto, give me a heads up. we would love to have you (and others) present at our probabilistic programming meetup here

suhailshergill commented 8 years ago

@dustinvtran PS github doesn't show ftp links, but i've edited my comment for the computer algebra paper

dustinvtran commented 8 years ago

Yeah, will definitely give you a heads-up! Also letting others in the group know.

rokgerzelj commented 8 years ago

Hey, any updates on this?

dustinvtran commented 8 years ago

You can see preliminary work on the feature/metagraph branch. Pull request #192 makes progress from Edward's current API toward supporting various features of the language.

There is an enormous number of challenges in the language design, which I've been furrowing my brow over for the past two months. Here are some of the concepts floating around, to give a sense of what I'm working with:

  1. Leveraging TensorFlow for math operations
  2. Stochastic tensors and stochastic control flow
  3. Operations on random variables vs operations on samples (see the sketch after this list)
  4. Delayed construction of computational graphs
  5. Separation of model and inference
  6. Local vs global, i.e., what features go into a model "container" vs the individual random variables
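
On point 3, here is a hypothetical, library-agnostic sketch (none of this is Edward's API) of the distinction between operating on random variables and operating on samples:

```python
import random

class RandomVariable:
    """Stand-in class: records transformations symbolically, so algebra on
    random variables builds a new random variable; sampling happens only
    when .sample() is called."""
    def __init__(self, sampler):
        self.sampler = sampler          # thunk producing one draw
    def sample(self, n=1):
        # Operation on samples: draw concrete values.
        return [self.sampler() for _ in range(n)]
    def __add__(self, c):
        # Operation on the random variable: returns a new random variable.
        return RandomVariable(lambda: self.sampler() + c)
    def __rmul__(self, c):
        return RandomVariable(lambda: c * self.sampler())

x = RandomVariable(lambda: random.gauss(0.0, 1.0))
y = 2.0 * x + 1.0        # still a RandomVariable; nothing has been drawn yet
draws = y.sample(3)      # concrete samples materialize only here
```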

Things will become more concrete as we understand these ideas better.