blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

KLqp ignores the ancestors' values in data input parameter #900

Open rcabanasdepaz opened 6 years ago

rcabanasdepaz commented 6 years ago

In the code below, a model with two normal variables is defined: x ~ Normal(0, 1) and y ~ Normal(2*x, 1), i.e. the mean of y is 2*x.

Assume that I aim to learn a distribution q(y) approximating p(y|x=x_test). I expect to obtain a normal distribution with mean close to 2*x_test. This does not work with the ed.KLqp algorithm, so I guess that data={x: x_test} is ignored. To use this algorithm, I am forced to use ed.copy to fix x to x_test. On the other hand, the method ReparameterizationKLqp works as expected.
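For reference, since y ~ Normal(2*x, 1) and y is the only variable being inferred, conditioning on x = x_test gives the exact posterior p(y | x = x_test) = Normal(2*x_test, 1), so a correctly conditioned q(y) should recover a mean close to 2*x_test. A minimal NumPy sketch of that expected result (variable names here are illustrative, not Edward API):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
x_test = rng.normal(loc=10., scale=5., size=(N, 1))

# Draw many samples of y | x = x_test from Normal(2*x_test, 1) and check
# that the empirical mean recovers 2*x_test, which is the mean a well-fit
# q(y) should learn.
samples = rng.normal(loc=2 * x_test, scale=1., size=(5000, N, 1))
empirical_mean = samples.mean(axis=0)

print(np.abs(empirical_mean - 2 * x_test).max())  # small; mean is close to 2*x_test
```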

Is this a bug? Could anyone explain why this happens?

I am using Edward 1.3.5 and TensorFlow 1.5.0.

Thanks in advance.

import tensorflow as tf
import edward as ed
import numpy as np
from edward.models import Normal

N = 100

# model definition
x = Normal(loc=tf.zeros((N, 1)), scale=1.)
y = Normal(loc=2 * x, scale=1.)

# input data
x_test = np.random.normal(loc=10.,scale=5.,size=N).reshape((N,1)).astype("float32")

# output predictive distribution
qy = Normal(loc=tf.Variable(tf.zeros((N, 1)), dtype="float32", name="loc/y_pred"), scale=1.)

# inference
ed.KLqp({y: qy}, data={x: x_test}).run()                    # not working as expected
ed.get_session().run(qy.loc)

ed.KLqp({ed.copy(y, {x: x_test}): qy}, data={}).run()        # working as expected
ed.get_session().run(qy.loc)

ed.ReparameterizationKLqp({y: qy}, data={x: x_test}).run()  # working as expected
ed.get_session().run(qy.loc)

ed.ReparameterizationKLqp({ed.copy(y, {x: x_test}): qy}, data={}).run()        # working as expected
ed.get_session().run(qy.loc)