I think this is a great idea; it would also be very useful for VI. @ferrine
We already have KL divergence in https://github.com/pymc-devs/pymc3/blob/master/pymc3/variational/operators.py
My implementation is mostly for inference purposes, but wrapping it up should not be that difficult. I do not see a way to implement most distances within the VI module (due to the curse of dimensionality), but for the univariate case it makes sense. BTW, this might be interesting: http://jmlr.csail.mit.edu/papers/v13/gretton12a.html
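For reference, that paper describes the maximum mean discrepancy (MMD). Just to make the idea concrete, here is a minimal NumPy sketch of the biased MMD estimator between two sets of samples; the RBF kernel and the `lengthscale` parameter are arbitrary choices for illustration, not anything from PyMC3:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Pairwise RBF kernel matrix between the rows of a and b
    sq_dists = (np.sum(a ** 2, axis=1)[:, None]
                + np.sum(b ** 2, axis=1)[None, :]
                - 2.0 * a @ b.T)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def mmd2_biased(x, y, lengthscale=1.0):
    # Biased estimate of the squared MMD between sample sets x and y,
    # each of shape (n_samples, n_dims)
    # (Gretton et al., 2012, "A Kernel Two-Sample Test")
    k_xx = rbf_kernel(x, x, lengthscale)
    k_yy = rbf_kernel(y, y, lengthscale)
    k_xy = rbf_kernel(x, y, lengthscale)
    return k_xx.mean() + k_yy.mean() - 2.0 * k_xy.mean()

# e.g. mmd2_biased(np.random.randn(500, 2), np.random.randn(500, 2) + 1.0)
```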
Can SoftAbs be used for the kernel Stein discrepancy?
I don't see why not; at least within the framework of the VI module it should be quite straightforward.
@ferrine's implementation won't fit the use case of comparing two distributions, but I'll see what can be done. Any other thoughts on what kind of API would be useful?
@bhargavvader, are you thinking of an API more similar to the implementation in TensorFlow? For example, here is how Edward calls kl_divergence from TensorFlow:
https://github.com/blei-lab/edward/blob/master/edward/inferences/klqp.py#L451
kl_penalty = tf.reduce_sum([
    inference.kl_scaling.get(z, 1.0) * tf.reduce_sum(kl_divergence(qz, z))
    for z, qz in six.iteritems(inference.latent_vars)])
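(In other words, the snippet sums the KL divergence between each approximating distribution qz and its latent variable z over all latent variables, with an optional per-variable scaling factor taken from inference.kl_scaling.)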
Closing due to inactivity, feel free to reopen.
Would there be any interest in having a class of distance metrics for probability distributions? An example could be like the very rough pseudocode in this Jupyter notebook.
I had some questions in case we do want this:
What should be the parameters? A pymc3 or numpy/scipy distribution object, and in this case, what should be returned? Or numpy arrays of distributions, where we return a numerical value? This notebook does that.
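Just to make the second option concrete, here is a minimal sketch of what such a helper could look like: it takes numpy arrays of samples from two univariate distributions and returns a single numerical value (here a histogram-based Hellinger distance). All names and the binning scheme are hypothetical, purely for illustration:

```python
import numpy as np

def hellinger_distance(samples_p, samples_q, bins=50):
    # Hypothetical helper: estimate the Hellinger distance between two
    # univariate distributions given numpy arrays of samples from each.
    lo = min(samples_p.min(), samples_q.min())
    hi = max(samples_p.max(), samples_q.max())
    p, _ = np.histogram(samples_p, bins=bins, range=(lo, hi))
    q, _ = np.histogram(samples_q, bins=bins, range=(lo, hi))
    # Normalise the histograms to probability vectors over the shared bins
    p = p / p.sum()
    q = q / q.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# e.g. hellinger_distance(np.random.normal(size=2000),
#                         np.random.normal(loc=1.0, size=2000))
```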
Ping @ColCarroll, @twiecki