AllenDowney / ThinkBayes2

Text and code for the forthcoming second edition of Think Bayes, by Allen Downey.
http://allendowney.github.io/ThinkBayes2/

Chapter 4 - Proper Prior Probability Initialisation #73

Open iamyifan opened 6 months ago

iamyifan commented 6 months ago

Hi developer,

In Ch. 4, the prior is initialised as `prior = Pmf(1, hypos)`, which assigns the value 1 to every hypothesis, so the probabilities sum to `len(hypos)` rather than 1.


It would be better to initialise it as `prior2 = Pmf(Fraction(1, len(hypos)), hypos)`, which is a proper uniform distribution whose probabilities sum to exactly 1.
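A minimal sketch of the two initialisations, assuming the 101-point `hypos` grid from the chapter and `Pmf` from `empiricaldist`:

```python
from fractions import Fraction

import numpy as np
from empiricaldist import Pmf

hypos = np.linspace(0, 1, 101)

prior = Pmf(1, hypos)                         # every hypothesis gets 1; sums to 101
prior2 = Pmf(Fraction(1, len(hypos)), hypos)  # proper uniform; sums to exactly 1

print(prior.sum())   # 101
print(prior2.sum())  # 1
```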

The results and the final plot are the same either way; `np.allclose(posterior, posterior2)` returns `True`.

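Here is a sketch of that check, reusing `prior` and `prior2` from above. The `likelihood` dict, the update loop, and the 140-heads / 110-tails dataset follow the chapter's Euro example as I remember it, so treat those specifics as assumptions:

```python
likelihood = {'H': hypos, 'T': 1 - hypos}
dataset = 'H' * 140 + 'T' * 110

def update_euro(pmf, dataset):
    """Multiply in one likelihood per flip, then renormalize in place."""
    for data in dataset:
        pmf *= likelihood[data]
    pmf.normalize()

posterior = prior.copy()
update_euro(posterior, dataset)

posterior2 = prior2.copy()
update_euro(posterior2, dataset)

# The final normalize() cancels any constant factor in the prior, so the
# two posteriors agree; cast the Fraction-based one to float to compare.
print(np.allclose(posterior, posterior2.astype(float)))  # True
```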

Also, for better understanding, the loop inside the update function (the one sketched above) deserves a more detailed explanation.


For example: the reason we can update with a loop, multiplying in one likelihood at a time, is that the coin flips are independent of each other. In Bayes's theorem, $\theta$ is the probability that the coin lands heads, $P(\theta)$ is the prior, and $P(D_x \mid \theta)$ is the likelihood of the $x$-th flip, where $D_x$ can be 'H' or 'T'. Because the flips are independent, the loop computes $P(D_1 \mid \theta) \times P(D_2 \mid \theta) \times \cdots \times P(D_n \mid \theta) = P(D \mid \theta)$.
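A small sketch of this identity, reusing the names from the snippet above: updating flip by flip gives the same posterior as multiplying by the combined likelihood $P(D \mid \theta)$ once.

```python
# Because the flips are independent, n single-flip updates collapse
# into one multiplication by P(D|theta).
combined = Pmf(1, hypos)
combined *= likelihood['H'] ** 140 * likelihood['T'] ** 110  # P(D|theta)
combined.normalize()

print(np.allclose(posterior, combined))  # True: the loop equals one product
```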

Cheers, Yifan