BasicProbability / BasicProbability.github.io

Website for a course on basic probability within the Master of Logic at the University of Amsterdam

What Info Theory do we need? #23

Open philschulz opened 8 years ago

philschulz commented 8 years ago

I started with ch. 6. I now define entropy and show the binary entropy diagram. In addition, I will carry over the results we have for expectation (e.g., that the entropy of independent RVs is additive). I will finish by introducing the idea of KL divergence. Is there anything else you want me to put in?
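The two facts mentioned above (additivity of entropy for independent RVs, and KL divergence) can be checked numerically. Here is a minimal sketch in Python; the function names `entropy` and `kl_divergence` and the example distributions are my own illustration, not code from the course materials:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# Two independent RVs: the joint distribution is the outer product of the marginals.
X = [0.5, 0.5]
Y = [0.9, 0.1]
joint = [px * py for px in X for py in Y]

# Additivity: H(X, Y) = H(X) + H(Y) when X and Y are independent.
assert abs(entropy(joint) - (entropy(X) + entropy(Y))) < 1e-12

# KL divergence is non-negative, and zero when the distributions coincide.
assert kl_divergence(X, X) == 0
assert kl_divergence(Y, X) > 0
```

The fair coin `X` has entropy exactly 1 bit, so the joint entropy here is just 1 plus the entropy of `Y`.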

cschaffner commented 8 years ago

Wow, chapter 5 is quite a mouthful! ;-) It looks pretty good (and long) to me. See my changes in the pull request.

cschaffner commented 8 years ago

I don't think we have to insert more information theory than what you think is necessary. There seems to be enough material already.