[x] As mentioned in previous feedback, when you explain the Markov chain, please explain what a state is. You can use the above image for illustration.
Parameters of Markov chain
[x] The first line should be: "Let us now understand the parameters of a Markov chain. We can write the factorisation of our Markov chain as: $P(x_1, \ldots, x_T) = P(x_1)\prod_{t=2}^{T} P(x_t \mid x_{t-1})$"
[x] After mentioning K states, directly write the following in the next sentence: Markov chains leverage parameter sharing; instead of specifying $P(x_t \mid x_{t-1})$ separately for each $t$, we assume $P(x_t \mid x_{t-1})$ is common (shared) across all time steps (see the sketch after this list).
[x] This would be a good place to put your Markov chain FSMs. Put them in the same fashion as the drawing I have made. This will keep the diagram clean and consistent.
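To make the parameter-sharing point concrete, a minimal Python sketch could accompany this section, assuming a 2-state chain; the values of `pi` and `A` below are illustrative, not taken from the draft:

```python
import numpy as np

# Hypothetical 2-state chain (e.g., Sunny=0, Rainy=1); pi and A are made-up values.
pi = np.array([0.6, 0.4])             # prior P(x_1)
A = np.array([[0.8, 0.2],             # shared transition matrix: A[i, j] = P(x_t = j | x_{t-1} = i)
              [0.3, 0.7]])

def joint_probability(states):
    """P(x_1, ..., x_T) = P(x_1) * prod_{t=2}^{T} P(x_t | x_{t-1}), with A shared across all t."""
    p = pi[states[0]]
    for prev, curr in zip(states[:-1], states[1:]):
        p *= A[prev, curr]
    return p

print(joint_probability([0, 0, 1, 1]))  # e.g., Sunny, Sunny, Rainy, Rainy
```

Because A is shared, the chain needs only K(K-1) free transition parameters plus K-1 prior parameters, regardless of the sequence length T.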
Markov chain sampling
[x] This should be Markov chain sampling and not Markov model sampling.
[x] The last line in the paragraph is not needed.
[x] Is "recursively" the correct word? Better to use "iteratively" or "sequentially" (see the sampling sketch after this list).
[x] Please correct "Markov model" to "Markov chain" wherever necessary.
[x] For easy visualisation and consistency, please make the FSM consistent with your drawings. That is, also show the starting state, with edges going to Sunny and Rainy. Use the same colour scheme for the Sunny and Rainy nodes everywhere, including the diagrams I gave you.
[x] In the interactive plot, put a very light gray background behind the Generated Sequence, just to keep it visually separated.
[x] As mentioned in the previous feedback, please label the Transition matrix (A) and Prior Probability (π) in the interactive plot.
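A sequential-sampling sketch in the same spirit, assuming the Sunny/Rainy chain; again, `pi` and `A` are illustrative values, not from the draft:

```python
import numpy as np

rng = np.random.default_rng(0)

pi = np.array([0.6, 0.4])             # Prior Probability (Pi) over {Sunny=0, Rainy=1}
A = np.array([[0.8, 0.2],             # Transition matrix (A)
              [0.3, 0.7]])

def sample_chain(T):
    """Draw x_1 from pi, then draw each x_t from the row A[x_{t-1}], one step at a time."""
    states = [rng.choice(2, p=pi)]
    for _ in range(1, T):
        states.append(rng.choice(2, p=A[states[-1]]))
    return states

names = ["Sunny", "Rainy"]
print([names[s] for s in sample_chain(10)])  # e.g., a generated weather sequence
```

This is why "sequentially" fits: each state is drawn from a distribution conditioned only on the previous state.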
Hidden Markov model
[x] The starting text is not impactful.
[x] Just use the biased-coin example. Say something like: previously, in our unfair coin-toss example, we could observe whether the coin tossed was fair or biased. But imagine that, instead of observing which coin was tossed, we only get to observe whether the coin lands heads or tails. Thus, the "hidden" in ... (a sampling sketch follows this list).
[x] With this context explain the unrolled version of HMM for the three/four examples using the LibreOffice drawing I made as a reference.
[x] Ensure that the hidden nodes are not shaded.
[x] Then draw the FSMs and discuss HMM parameters.
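To match the biased-coin framing, a minimal HMM sampling sketch, assuming two hidden coins (Fair/Biased) and heads/tails observations; all probabilities below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden states: Fair=0, Biased=1; observations: Heads=0, Tails=1.
pi = np.array([0.5, 0.5])             # prior over which coin is used first
A = np.array([[0.9, 0.1],             # hidden-state transitions (which coin is tossed next)
              [0.1, 0.9]])
B = np.array([[0.5, 0.5],             # emission probabilities B[state, obs]
              [0.8, 0.2]])            # the biased coin lands heads 80% of the time

def sample_hmm(T):
    """Sample hidden states z_t and observations x_t; only the x_t would be visible."""
    z = rng.choice(2, p=pi)
    hidden, observed = [], []
    for _ in range(T):
        hidden.append(int(z))
        observed.append(int(rng.choice(2, p=B[z])))
        z = rng.choice(2, p=A[z])
    return hidden, observed

hidden, observed = sample_hmm(10)
print("hidden:  ", hidden)    # which coin was tossed -- hidden in practice
print("observed:", observed)  # heads/tails -- what we actually see
```

The FSM view then carries three parameter groups: the prior pi, the transition matrix A over hidden states, and the emission matrix B.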
Overall
Sequential modeling
[x] Always run the text through Grammarly. In Sequential modeling, there is no space between "Model" and "(HMM)".
[x] Make the last sentence in Sequential modeling shorter: "... for time-series data and often used for the above-mentioned applications."
[x] Remove the last line ("In the sequence above...") before the Markov chain paragraph. It does not add value.
[x] Ensure you introduce the convention that shaded nodes denote observed variables.
Markov Chain
[x] The first line is not required. It does not add value.
[x] Capitalize "Markov" in the second sentence. It should be: A Markov chain is the simplest Markov model.
[x] Please restore the notation I had written: given the present ($x_t$), .... (A plausible reconstruction appears at the end of this section.)
[x] There is a lot of space after the $P(x_1, x_2, \ldots)$ equation and before "We now discuss some common day-to-day ...".
[x] I am not sure "day-to-day" is a common phrase; furthermore, the word is not adding any value. Capitalize "Markov" in this sentence.
[x] Instead of showing the FSM for Sunny, Rainy, we should first show what the Markov chains for these examples look like. See this image.
Wherever possible, we should use neat images like this which are self-explanatory. NB - I drew this image in LibreOffice.
hmm-diagrams.pptx
If we also make the language model example, we can do 2 rows and 2 columns. Or, if we have enough width, we can do 1 row and 4 columns.
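For the notation point above, a plausible LaTeX reconstruction, assuming the intended statement is the first-order Markov property (the exact wording of the restored sentence is yours to confirm):

```latex
% Given the present (x_t), the future is independent of the past:
P(x_{t+1} \mid x_1, x_2, \ldots, x_t) = P(x_{t+1} \mid x_t)
```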