depthfirstlearning / depthfirstlearning.com


InfoGAN: Looking for good explanation for relationship between JS divergence, Jensen's inequality and Shannon Entropy #2

Open avital opened 6 years ago

avital commented 6 years ago

Why is the Jensen-Shannon divergence named that way?

The answer is roughly the derivation here: https://dit.readthedocs.io/en/latest/measures/divergences/jensen_shannon_divergence.html#derivation. There, the mixture M = (P+Q)/2 is a convex combination of P and Q, and the expectation is taken over the binary variable that selects which of {P, Q} to draw from. The JS divergence is then the Jensen gap of the (concave) Shannon entropy under that expectation, which is why both Jensen and Shannon appear in the name.
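As a sketch of the identity in question (writing M = (P+Q)/2, and Z for a fair coin selecting one of {P, Q}):

```latex
\mathrm{JSD}(P \,\|\, Q)
  = H\!\left(\tfrac{P+Q}{2}\right) - \tfrac{1}{2}H(P) - \tfrac{1}{2}H(Q)
  = H\big(\mathbb{E}_Z[X_Z]\big) - \mathbb{E}_Z\big[H(X_Z)\big] \;\ge\; 0,
```

where the inequality is exactly Jensen's inequality applied to the concave entropy function H.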

We'd like a clear write-up of this definition of the JS divergence, including a proof that it is equivalent to the others.
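Until someone writes that up, here is a quick numerical sanity check (a minimal sketch, not part of any library) that the entropy form H(M) − ½H(P) − ½H(Q) agrees with the usual average-KL-to-the-mixture definition:

```python
import math

def entropy(p):
    # Shannon entropy in bits; 0 * log(0) is treated as 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl(p, q):
    # KL divergence D(p || q) in bits (q must be positive wherever p is).
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd_kl(p, q):
    # Standard definition: average KL to the mixture M = (P + Q) / 2.
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jsd_entropy(p, q):
    # Entropy form: H(M) - (H(P) + H(Q)) / 2, the Jensen gap of H.
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

p = [0.1, 0.4, 0.5]
q = [0.6, 0.3, 0.1]
print(jsd_kl(p, q), jsd_entropy(p, q))  # the two values agree
```

Expanding the two KL terms and collecting the mixture terms turns one expression into the other, so the agreement holds exactly (up to floating point).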