blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

same posterior mean of latent variable with different n_iter in inference.run() #424

Open ruzihao opened 7 years ago

ruzihao commented 7 years ago

Sorry for forgetting to provide a description.

I was implementing an LDA model and was able to get the posterior mean of a certain latent variable without any problem.

The problem is that whatever n_iter I specify in inference.run(), the model gives me exactly the same posterior mean for that latent variable.

Usually, with more iterations, we should get a finer result (or at least a different one), right? Any thoughts on this?
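
For concreteness, here is a hypothetical minimal sketch of the calling pattern I mean, using a toy Beta-Bernoulli model with ed.KLqp instead of my actual LDA code (which is not included here):

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Bernoulli, Beta

# Toy data and model (a stand-in for the LDA model in question).
x_data = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])
p = Beta(1.0, 1.0)
x = Bernoulli(probs=p, sample_shape=10)

# Variational approximation over the latent variable.
qp = Beta(tf.nn.softplus(tf.Variable(1.0)),
          tf.nn.softplus(tf.Variable(1.0)))

inference = ed.KLqp({p: qp}, data={x: x_data})
inference.run(n_iter=1000)  # changing n_iter (e.g. 100 vs 1000) still gives the same mean

sess = ed.get_session()
print(sess.run(qp.mean()))
```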

dustinvtran commented 7 years ago

Could you give a reproducible example? One thing to note is that inference.run(n_iter=n_iter) does everything: it reinitializes all computation and runs inference for n_iter iterations from scratch. If you want to examine the posterior after every X iterations, I recommend looking at inference.update(), which runs the computation one iteration at a time.
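
A rough sketch of that update() pattern, again with a toy Beta-Bernoulli model as a placeholder rather than your LDA model (so the model specifics here are assumptions, not your code):

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Bernoulli, Beta

# Placeholder model, same spirit as the sketch above.
x_data = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])
p = Beta(1.0, 1.0)
x = Bernoulli(probs=p, sample_shape=10)
qp = Beta(tf.nn.softplus(tf.Variable(1.0)),
          tf.nn.softplus(tf.Variable(1.0)))

inference = ed.KLqp({p: qp}, data={x: x_data})
inference.initialize(n_iter=1000)

sess = ed.get_session()
sess.run(tf.global_variables_initializer())

for t in range(inference.n_iter):
    info_dict = inference.update()      # one iteration of inference
    inference.print_progress(info_dict)
    if (t + 1) % 100 == 0:
        # Inspect the approximate posterior mean as inference proceeds.
        print("iteration {}: posterior mean = {}".format(t + 1, sess.run(qp.mean())))

inference.finalize()
```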

For a complete example, you can also check out the inference piece of examples/mixture_gaussian_collapsed.py.