I am trying to implement variational inference for linear combinations of mixtures of Gaussians, in the flavor of Attias, H. "Independent Factor Analysis." Neural Computation 11.4 (1999): 803-851.

In the following I show a 1D example where I essentially try to implement a variational EM algorithm, with an E step on the categorical latent variables and an M step on the mixture parameters. I took inspiration from http://edwardlib.org/api/inference-compositionality but cannot figure out what is missing. Running the update of the E step raises an error. I might have missed something, but since I have now reduced the model to a rather minimal one (univariate, two free parameters), I have no idea how to fix it.
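A minimal sketch of the kind of setup I mean (not my exact code; names such as x_train are placeholders and the data is simulated):

```python
import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import Categorical, Normal, PointMass

N, K = 500, 2                                    # data points, mixture components
x_train = np.random.randn(N).astype(np.float32)  # simulated stand-in data

# Mixture of Gaussians with explicit assignments:
# component means mu, per-point assignments z, observations x.
mu = Normal(loc=tf.zeros(K), scale=tf.ones(K))
z = Categorical(logits=tf.zeros([N, K]))
x = Normal(loc=tf.gather(mu, z), scale=tf.ones(N))

# Variational families: qz for the E step, a point mass qmu for the M step.
qz = Categorical(logits=tf.Variable(tf.zeros([N, K])))
qmu = PointMass(params=tf.Variable(tf.zeros(K)))
```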
Thanks for asking. VariationalInference is an abstract class. You need to use a subclass of VariationalInference such as KLqp, KLpq, or MAP. Also see http://edwardlib.org/api/inference-classes.
That said, the error message should be more informative; I added an issue to solve this.
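For instance, continuing the hypothetical sketch above, the two steps can be set up with concrete subclasses, mirroring the pattern in the compositionality docs:

```python
# E step: variational inference over the assignments z,
# holding the means at their current point estimate qmu.
inference_e = ed.KLqp({z: qz}, data={x: x_train, mu: qmu})

# M step: MAP (point) estimation of the means mu,
# holding the assignments at qz.
inference_m = ed.MAP({mu: qmu}, data={x: x_train, z: qz})
```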
Hi Dustin. Thanks for the quick response. I forgot to mention that I already tried that with KLpq and KLqp and got the same error message.
In the doc for inference hybrids, the ... means that there are intermediate steps left out for brevity. You need to initialize the algorithms before being able to call update(). Also see http://edwardlib.org/api/inference and the Monte Carlo EM example in https://github.com/blei-lab/edward/blob/master/examples/factor_analysis.py.
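Concretely, the elided ... amounts to something like this (continuing the same hypothetical sketch):

```python
# Initialize both algorithms before calling update().
inference_e.initialize()
inference_m.initialize()

sess = ed.get_session()
tf.global_variables_initializer().run()

# Alternate E and M updates.
for _ in range(1000):
  inference_e.update()
  inference_m.update()
```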
Thanks again. Indeed, I missed those .... If I initialize inference_e (assuming it uses default parameters when I don't pass any), it raises a TypeError that I have difficulty understanding (I have hit many of these while trying to make the inference work):
TypeError: cat must be a Categorical distribution, but saw: Tensor("inference_614459625544/0/Categorical_3/sample/Reshape_1:0", shape=(500,), dtype=int32)
As far as I understand, the variables are correctly defined as categorical. I have tried to implement inference for such models in several ways (before trying the EM approach) and always ran into an incompatibility of some sort. I apologize for the beginner questions; I will look at the Monte Carlo EM example in detail, and maybe I'll find the answer there.
The Mixture random variable sums out the mixture assignments, so you shouldn't be trying to infer them. If you aim to do a variational E-step to infer latent mixture assignments, the observed variables are just the normal distributions. See, e.g., http://edwardlib.org/tutorials/unsupervised for the difference between a mixture of Gaussians and a collapsed mixture of Gaussians.
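To make the distinction concrete, here is a sketch assuming the ParamMixture API from that tutorial (mixing weights and component parameters are placeholder constants):

```python
import tensorflow as tf
from edward.models import Categorical, Mixture, Normal, ParamMixture

N, K = 500, 2

# Collapsed: Mixture sums out the assignments, so there is no z to infer;
# each data point's likelihood is the mixture density itself.
cat = Categorical(probs=tf.ones([N, K]) / K)
components = [Normal(loc=tf.fill([N], loc), scale=tf.ones(N))
              for loc in [-1.0, 1.0]]
x_collapsed = Mixture(cat=cat, components=components)

# Uncollapsed: ParamMixture keeps explicit assignments, exposed as x.cat;
# that Categorical is what a variational E step would target.
x = ParamMixture(tf.ones(K) / K,
                 {'loc': tf.constant([-1.0, 1.0]), 'scale': tf.ones(K)},
                 Normal, sample_shape=N)
z = x.cat
```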
I apologize for the beginner questions; I thought I could use these tools without having to delve into the underlying implementation details, but that might just not be realistic.
Not at all! Questions like these are highly encouraged. Whatever's not clear to you is not your mistake but the documentation's.
If you have more questions, feel free to post on the Forum (https://discourse.edwardlib.org). Closing as it isn't a bug.