blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

Rewriting tutorials and scipy.stats #360

Closed bhargavvader closed 7 years ago

bhargavvader commented 7 years ago

This is mostly with regard to the changes in #344. The example scripts haven't been rewritten to reflect those changes, and they should be.

The examples themselves are also sometimes broken. For instance, in examples/tf_gp_classification.py, importing bernoulli and multivariate_normal from scipy.stats instead of edward.stats raises the following error (and scipy.stats is the cause):

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/inference.py:171: DeprecationWarning: Model wrappers are deprecated. Edward is dropping support for model wrappers in future versions; use the native language instead.
  "native language instead.", DeprecationWarning)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-8baff3139162> in <module>()
     88 data = {'x': X_train, 'y': y_train}
     89 inference = ed.KLqp({'z': qz}, data, model)
---> 90 inference.run(n_iter=500)

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/inference.pyc in run(self, logdir, variables, use_coordinator, *args, **kwargs)
    209       Passed into ``initialize``.
    210     """
--> 211     self.initialize(*args, **kwargs)
    212 
    213     if logdir is not None:

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/klqp.pyc in initialize(self, n_samples, *args, **kwargs)
     61     """
     62     self.n_samples = n_samples
---> 63     return super(KLqp, self).initialize(*args, **kwargs)
     64 
     65   def build_loss_and_gradients(self, var_list):

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/variational_inference.pyc in initialize(self, optimizer, var_list, use_prettytensor, *args, **kwargs)
    102 
    103     if getattr(self, 'build_loss_and_gradients', None) is not None:
--> 104       self.loss, grads_and_vars = self.build_loss_and_gradients(var_list)
    105     else:
    106       self.loss = self.build_loss()

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/klqp.pyc in build_loss_and_gradients(self, var_list)
    101       #    loss = build_reparam_entropy_loss(self)
    102       else:
--> 103         loss = build_reparam_loss(self)
    104 
    105       if var_list is None:

/Users/bhargavvader/Open_Source/edward/venv/src/edward/edward/inferences/klqp.pyc in build_reparam_loss(inference)
    359     else:
    360       x = inference.data
--> 361       p_log_prob[s] = inference.model_wrapper.log_prob(x, z_sample)
    362 
    363   p_log_prob = tf.pack(p_log_prob)

<ipython-input-5-8baff3139162> in log_prob(self, xs, zs)
     66     x, y = xs['x'], xs['y']
     67     log_prior = multivariate_normal.logpdf(
---> 68         zs['z'], tf.zeros(self.N), self.kernel(x))
     69     log_lik = tf.reduce_sum(
     70         bernoulli.logpmf(y, p=self.inverse_link(y * zs['z'])))

/Users/bhargavvader/Open_Source/edward/venv/lib/python2.7/site-packages/scipy/stats/_multivariate.pyc in logpdf(self, x, mean, cov, allow_singular)
    474 
    475         """
--> 476         dim, mean, cov = self._process_parameters(None, mean, cov)
    477         x = self._process_quantiles(x, dim)
    478         psd = _PSD(cov, allow_singular=allow_singular)

/Users/bhargavvader/Open_Source/edward/venv/lib/python2.7/site-packages/scipy/stats/_multivariate.pyc in _process_parameters(self, dim, mean, cov)
    368                         dim = cov.shape[0]
    369             else:
--> 370                 mean = np.asarray(mean, dtype=float)
    371                 dim = mean.size
    372         else:

/Users/bhargavvader/Open_Source/edward/venv/lib/python2.7/site-packages/numpy/core/numeric.pyc in asarray(a, dtype, order)
    480 
    481     """
--> 482     return array(a, dtype, copy=False, order=order)
    483 
    484 def asanyarray(a, dtype=None, order=None):

ValueError: setting an array element with a sequence.

dustinvtran commented 7 years ago

You can replace edward.stats with scipy.stats if you only call the sampling methods, e.g., bernoulli.rvs and multivariate_normal.rvs.
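For instance, generating synthetic data for a GP classification model with scipy.stats alone works fine, since sampling happens purely in NumPy. A minimal sketch; the RBF kernel and the names `N`, `x`, `K` are illustrative stand-ins, not taken from the example script:

```python
import numpy as np
from scipy.stats import bernoulli, multivariate_normal

np.random.seed(0)
N = 5
x = np.linspace(0.0, 1.0, N)

# Toy RBF kernel matrix (a stand-in for the script's kernel), with
# jitter on the diagonal for numerical stability.
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 1e-6 * np.eye(N)

# rvs only draws samples, so scipy.stats is safe here.
z = multivariate_normal.rvs(mean=np.zeros(N), cov=K)  # latent function values
p = 1.0 / (1.0 + np.exp(-z))                          # logistic inverse link
y = bernoulli.rvs(p)                                  # binary labels
```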

The above raises an error because it also tries to calculate log-densities; these must be written in TensorFlow. In that case, you have to construct Edward random variables and then call their methods. For example, in the line that raises the error:

from edward.models import MultivariateNormalFull

# Build an Edward random variable, then evaluate its log-density in TensorFlow.
mvn = MultivariateNormalFull(tf.zeros(self.N), self.kernel(x))
log_prior = mvn.log_prob(zs['z'])
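The error itself bottoms out in NumPy rather than Edward: scipy.stats coerces every argument with np.asarray, which cannot convert a symbolic TensorFlow tensor into a float array. The same ValueError can be reproduced without TensorFlow; the ragged list below is only an analogy for a non-coercible input, not what the script actually passes:

```python
import numpy as np

# scipy.stats' logpdf does `np.asarray(mean, dtype=float)` internally;
# any input NumPy cannot coerce to a float array fails the same way.
ragged = [np.zeros(2), np.zeros(3)]  # stand-in for a symbolic tensor
try:
    np.asarray(ragged, dtype=float)
    message = ""
except ValueError as e:
    message = str(e)
```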

(Note: in practice I don't see this as a use case. Users will use both deprecated features, edward.stats and model wrappers, or neither, not just one of them.)

bhargavvader commented 7 years ago

This makes sense, thanks for the reply! Do you think it's worth rewriting any tutorials to take care of the deprecated features?

bhargavvader commented 7 years ago

Okay, I can see this is already being done, as with the supervised learning tutorial and the mixture density network tutorial.

Is someone already working on this particular one?

dustinvtran commented 7 years ago

Not yet. All the tutorials have so far been rewritten in the native language except for two: the mixture of Gaussians and the mixture density network. There's an outstanding pull request (#350) for these two, which I haven't gotten to work. I welcome any help!