jean997 / cause

R package for CAUSE
https://jean997.github.io/cause/

CAUSE's results #38

Closed. giuliapontali closed this issue 1 year ago.

giuliapontali commented 1 year ago

Hi,

I am trying to compare the results obtained with CAUSE to those obtained with classical Mendelian randomization approaches. Could you explain in more detail the meaning of the models in the res$elpd results?

How can we define the best model?

Also, if I run CAUSE without the clumping step, I find that model 1 is better than model 2. Do you think these results make sense?

Thanks a lot for clarifying.

jean997 commented 1 year ago

Yes, your interpretations are correct. I encourage you to check out the description here: https://jean997.github.io/cause/ldl_cad.html#Step_5:_Look_at_Results. The best-fitting model is the one with the highest elpd. The z-score tells you whether the difference in elpd between the two models is significant. If we use the results in the link above as an example, delta_elpd is negative on every row. This means that the elpd for the sharing model is higher than the elpd for the null model, and the elpd for the causal model is higher than the elpd for the sharing model. So the causal model has the highest elpd. We can also look at the z-score and conclude that the causal model fits significantly better than the sharing model.
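
For concreteness, here is a minimal sketch in R of how you might look at these quantities yourself. It assumes res is the object returned by cause() and uses the column names shown in the vignette linked above:

res$elpd   # data frame with columns model1, model2, delta_elpd, se_delta_elpd, z

# delta_elpd is elpd(model1) - elpd(model2), so a negative value means the
# second model in that row fits better than the first.

# Pull out the sharing vs causal comparison and convert its z-score into a
# one-sided p-value (small values favour the causal model).
cmp <- subset(res$elpd, model1 == "sharing" & model2 == "causal")
pnorm(cmp$z)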

CAUSE requires independent variants so you should not run it without clumping.
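
If it helps, one common way to get an approximately independent set of variants before running CAUSE is plink-based clumping through the ieugwasr package. This is only a sketch under a few assumptions: X is the merged data set (e.g. from gwas_merge()) with columns snp and p1, params comes from est_cause_params(), and you have a local plink binary and LD reference panel:

library(dplyr)
library(ieugwasr)

# Clump on the trait-1 p-values to keep roughly independent variants.
clumped <- X %>%
  rename(rsid = snp, pval = p1) %>%
  ld_clump(clump_r2 = 0.01, clump_p = 1e-3,
           plink_bin = genetics.binaRies::get_plink_binary(),  # assumed local plink binary
           bfile = "path/to/ld_reference_panel")               # assumed local LD panel

top_vars <- clumped$rsid
res <- cause(X = X, variants = top_vars, param_ests = params)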

giuliapontali commented 1 year ago

Thank you.

Can you just confirm whether the causal model means causality without pleiotropy, or causality with pleiotropy? Is it possible to extract the elpd for each model?

I also have another question about the results. Why, when I run summary(res, ci_size=0.95), do I get:

        Length Class          Mode
sharing  8     cause_post     list
causal   8     cause_post     list
elpd     5     cause_elpd     list
loos     3     -none-         list
data    13     cause_data_fit list
sigma_g  1     -none-         numeric
qalpha   1     -none-         numeric
qbeta    1     -none-         numeric

Thanks