Hey Christopher,
In your article Visual Information Theory, just before footnote reference five, where you set up the Kullback–Leibler divergence, I believe the two images have been switched: the one showing the entropy of p versus the cross entropy of p with respect to q, and the one showing the reverse.
Amazing blog, by the way. I came across it while reading through the TensorFlow tutorial and plan to go through all of your posts. You have a great way of explaining things intuitively! Thanks for all the hard work you've put into it.