vasishth / bayescogsci

Draft of book entitled An Introduction to Bayesian Data Analysis for Cognitive Science by Nicenboim, Schad, Vasishth

Multinomial Processing Tree #20

Closed. AlexSwiderski closed this issue 2 years ago.

AlexSwiderski commented 2 years ago

Hello!

I just stumbled upon your wonderful book; major kudos to you all. I have worked pretty closely with the Walker et al. MPT model and have a question/suggestion. Feel free to ignore :D

One possible typo: in Section 19.2.1.1 you state that "By navigating through the branches of the MPT (Figure 19.3), we can calculate the probabilities of the four responses (the categorical outcomes), based on the underlying parameters assumed in the MPT." Should this be five categorical outcomes (NR, Neologism, Formal, Mixed, and Correct)?
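(For concreteness, here is how I read "navigating through the branches": multiplying the probabilities along each path to a leaf. A minimal sketch in R, assuming one possible branch structure; the authoritative tree is Figure 19.3 in the book, so the branch assignments and parameter glosses below are my illustrative guesses, not the book's equations:)

# A sketch of turning MPT parameters into outcome probabilities.
# Branch structure and glosses are assumed for illustration only.
a_true <- .75  # Pr(an attempt is initiated)               -- assumed gloss
t_true <- .9   # Pr(target word selected | attempt)        -- assumed gloss
f_true <- .8   # Pr(phonologically related | wrong word)   -- assumed gloss
c_true <- .1   # Pr(correct phonemes | target word)        -- assumed gloss
theta <- c(
  NR        = 1 - a_true,
  Neologism = a_true * (1 - t_true) * (1 - f_true),
  Formal    = a_true * (1 - t_true) * f_true,
  Mixed     = a_true * t_true * (1 - c_true),
  Correct   = a_true * t_true * c_true
)
sum(theta)  # the five path probabilities sum to 1, as any MPT's must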

Second, and more the point of my comment: may I ask where the true underlying values for the simulated data came from? I am building a new MPT model with eye-tracking data and thought it would be fun to run through this code with a new data set, but where those true values come from has me confused (and potentially other readers too).

From Section 19.2.1.2, Generate simulated data:

# true underlying values for simulated data
a_true <- .75 
t_true <- .9
f_true <- .8 
c_true <- .1 
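For what it's worth, once those values are fixed, generating the simulated responses is just a multinomial draw over the five outcome probabilities. A minimal sketch, reusing the theta vector from my sketch above (the 200-trial count is an arbitrary choice of mine, not from the book):

set.seed(123)    # reproducibility for this sketch only
n_trials <- 200  # arbitrary number of simulated naming trials
# one multinomial draw of n_trials responses over the five categories
counts <- rmultinom(1, size = n_trials, prob = theta)
rownames(counts) <- names(theta)
counts  # simulated response counts per category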

Cheers, Alex

bnicenboim commented 2 years ago

Hi, thanks for catching that typo!

Sorry to say that I think I just made up the true values based on nothing, really :) It's just an exercise, and the original MPT model was much more complex (and actually based on sound, or at least justified, assumptions); you should use that one for a new dataset, not the one in the book.

Best, Bruno

vasishth commented 2 years ago

Perhaps we should mention in the chapter that the values are just for illustration.

bnicenboim commented 2 years ago

Yes, good idea.

AlexSwiderski commented 2 years ago

From an outsider's perspective, both the typo fix and a clarification of where the original proportions for the error counts came from would have 100% eliminated my confusion.

Thanks again for all this hard work, and for being so open to my comment!