nipunbatra / hmm

https://nipunbatra.github.io/hmm/
Creative Commons Attribution 4.0 International

Feedback April 28 #2

Open nipunbatra opened 4 years ago

nipunbatra commented 4 years ago

- Authors
- Title
- Sequential modeling
- Markov model
- Markov model sampling
- Text generation using Markov chains
- HMM section
- HMM parameters
- HMM sampling
- HMM evidence likelihood
- Model evidence

On the right-hand side of this GIF, put the main question: P(H, H, H | Z1=B, Z2=B, Z3=B) = … P(Z1=B) · P(Z2=B | Z1=B) … This way it would be easy to connect the likelihood calculations.
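To make that annotation concrete, here is a minimal sketch of the joint probability of the observations along one specific hidden path. The two states F/B (fair/biased coin), the H/T emissions, and every number below are placeholder assumptions, not the tutorial's actual parameters:

```python
# Hypothetical two-coin HMM parameters (placeholders, not the tutorial's values).
pi = {"F": 0.5, "B": 0.5}                       # initial distribution
A = {"F": {"F": 0.8, "B": 0.2},
     "B": {"F": 0.2, "B": 0.8}}                 # transition probabilities
E = {"F": {"H": 0.5, "T": 0.5},
     "B": {"H": 0.9, "T": 0.1}}                 # emission probabilities

# Joint probability of observing H, H, H along the single path Z = (B, B, B):
# P(Z1=B) * P(H|Z1=B) * P(Z2=B|Z1=B) * P(H|Z2=B) * P(Z3=B|Z2=B) * P(H|Z3=B)
p = (pi["B"] * E["B"]["H"]
     * A["B"]["B"] * E["B"]["H"]
     * A["B"]["B"] * E["B"]["H"])
print(p)
```

Each path contributes one such product; the GIF's per-step factors correspond to the terms above.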

This GIF can be slowed down.

Finally, in the text we need to succinctly say that P(HHH | θ) is the sum of the probabilities of the eight (2^3) hidden-state paths.

Then, generalize to K^T paths. Simplify the text surrounding the T·K^T cost and maybe just write that it is exponential in T.

Then, explain that this is exactly what the forward algorithm computes efficiently, and introduce the α convention. Then see what text can be deleted. Do the same for the backward algorithm. Before explaining backward, explain why we even need it: which HMM problem does it help solve?
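The forward recursion could be sketched as below, reusing the same placeholder parameters (states, symbols, and numbers are assumptions, not the tutorial's values). It replaces the K^T path enumeration with an O(T·K²) dynamic program:

```python
# Forward algorithm sketch: alpha_t(k) = P(x_1..x_t, Z_t = k).
# Hypothetical two-state HMM parameters (placeholders, not the tutorial's values).
states = ["F", "B"]
pi = {"F": 0.5, "B": 0.5}
A = {"F": {"F": 0.8, "B": 0.2}, "B": {"F": 0.2, "B": 0.8}}
E = {"F": {"H": 0.5, "T": 0.5}, "B": {"H": 0.9, "T": 0.1}}

def forward_likelihood(obs):
    # Initialisation: alpha_1(k) = pi(k) * P(x_1 | k)
    alpha = {k: pi[k] * E[k][obs[0]] for k in states}
    # Recursion: alpha_t(k) = (sum_j alpha_{t-1}(j) * A[j][k]) * P(x_t | k)
    for x in obs[1:]:
        alpha = {k: sum(alpha[j] * A[j][k] for j in states) * E[k][x]
                 for k in states}
    # Termination: P(obs | theta) = sum_k alpha_T(k)
    return sum(alpha.values())

print(forward_likelihood(["H", "H", "H"]))
```

In exact arithmetic this returns the same value as summing the probabilities of all K^T paths directly, which is the connection the text should make explicit.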