pjve90 / LCV_RD_ABM


Why log-odds transformation? #21

Open pjve90 opened 1 month ago

pjve90 commented 1 month ago

In this line, when we run 100 iterations (i.e., years) we take the log-odds of the block matrix, but when we run a single iteration earlier in the code we do not. I also checked the function in "transfers_blockprobs_fx.R", and there we use the original probabilities in "blockmatrix" rather than the log-odds.

I am confused about whether we should use the probabilities or the log-odds, so I don't know where to fix the code.

https://github.com/pjve90/LCV_RD_ABM/blob/d97253be205cc2842971dc75f3b9002002c40980/Model_code/ABM_code_final.R#L403C1-L404C51
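For context, a log-odds (logit) transformation of a probability matrix would look roughly like this. This is a minimal sketch; `blockmatrix` is used as an illustrative name matching the issue, not a copy of the repo's code at the linked line:

```r
# Hypothetical block matrix of tie probabilities (values in (0, 1))
blockmatrix <- matrix(c(0.8, 0.2,
                        0.3, 0.6),
                      nrow = 2, byrow = TRUE)

# Log-odds (logit) transformation: log(p / (1 - p))
blockmatrix_logodds <- log(blockmatrix / (1 - blockmatrix))
blockmatrix_logodds
```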

danielRedhead commented 1 month ago

It's been a while, so I've basically forgotten the code and our conversations, sorry! Looking back, I think the last time we spoke it was about needing to use log-probabilities or real numbers as input, because that is what the softmax function expects. So, just don't use raw probabilities, and play around with the input values to make sure that they (and the resulting network) make sense!

If you use log-probabilities, think of them as the logarithm of the likelihood of each tie; softmax will translate them into a probability distribution. If you input arbitrary real numbers (e.g., 9 or 4), think of them as representing preferences/importances/utilities; softmax turns them into a probability distribution where higher scores correspond to higher probabilities.
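To make the distinction concrete, here is a minimal sketch of a softmax in R (the function is written out here for illustration, not taken from the repo). Whether the inputs are log-probabilities or arbitrary scores, softmax maps them to a valid probability distribution in which larger inputs receive larger probabilities:

```r
# Minimal softmax: exponentiate and normalise. Subtracting the max
# improves numerical stability without changing the result.
softmax <- function(x) {
  ex <- exp(x - max(x))
  ex / sum(ex)
}

# Inputs as log-probabilities: softmax recovers the original distribution
logp <- log(c(0.7, 0.2, 0.1))
softmax(logp)    # 0.7 0.2 0.1

# Inputs as arbitrary real-valued scores/utilities
scores <- c(9, 4, 1)
softmax(scores)  # higher score -> higher probability
```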