statdivlab / corncob

Count Regression for Correlated Observations with the Beta-binomial

[Question] Robust SEs for longitudinal/repeated measures designs? #176

Closed giulianonetto closed 3 months ago

giulianonetto commented 3 months ago

Hi all!

Thank you so much for your awesome work!

I have a question about experiments with repeated measures within each subject. I understand that #63 discusses an LRT for this type of situation, but I am not sure how that handles within-subject correlation.

In this setting, would it be appropriate to use robust = TRUE in differentialTest (or, more generally, the sand_vcov() method)? I have in mind something of the generic form below:

differentialTest(formula = ~ time_point*treatment_group, formula_null = ~ 1, test = "Wald", robust = TRUE, ...)

That is, would that be analogous to a GEE approach to longitudinal data, rather than the approach discussed in #63?
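For concreteness, here is a fuller sketch of the call I have in mind (ps is a phyloseq object whose sample data contains time_point, treatment_group, and subject_ID; the phi.formula arguments simply mirror the mean model, and all names are placeholders for my actual design):

```r
library(corncob)

# Sketch only: `ps` is a phyloseq object with per-sample metadata
# time_point, treatment_group, and subject_ID (placeholder names).
dt <- differentialTest(
  formula = ~ time_point * treatment_group,
  phi.formula = ~ time_point * treatment_group,
  formula_null = ~ 1,
  phi.formula_null = ~ time_point * treatment_group,
  data = ps,
  test = "Wald",
  boot = FALSE,
  robust = TRUE,      # sandwich (robust) standard errors, as in sand_vcov()
  fdr_cutoff = 0.05
)
```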

Thank you very much! Best, Giuliano

adw96 commented 3 months ago

Hi Giuliano! Unfortunately, robust standard errors are not the same thing as accounting for the within-subject correlation that arises from longitudinally collected measurements. Potentially you could go with

differentialTest(formula = ~ time_point*treatment_group + subject_ID, formula_null = ~ subject_ID...)

but this may involve a large number of predictors, which is not a good idea in general.
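Spelled out, that would look roughly like the sketch below (same placeholder names and ps object as above; the subject_ID dummies are what can blow up the number of parameters):

```r
# Sketch only: subject_ID enters both the full and null mean models as a
# fixed effect, so the Wald test still targets the time_point and
# treatment_group terms.
dt_fixed <- differentialTest(
  formula = ~ time_point * treatment_group + subject_ID,
  phi.formula = ~ time_point * treatment_group,
  formula_null = ~ subject_ID,
  phi.formula_null = ~ time_point * treatment_group,
  data = ps,
  test = "Wald",
  boot = FALSE,
  fdr_cutoff = 0.05
)
```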

As of right now, we don't have plans to implement further generalizations of corncob, which I'm thinking of as in "essential maintenance only" mode. Instead, we are referring folks to our new package radEmu, which addresses some of corncob's limitations (most notably, differential detection of taxa), and radEmu also has an option to address within-subject correlation. I recommend checking it out! Note that it's what we are using for all of our analyses these days 😻
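A minimal sketch of what that might look like is below; please check the radEmu vignettes for the current interface. In particular, the cluster argument shown here for within-subject correlation is an assumption to verify against the emuFit documentation:

```r
library(radEmu)

# counts: samples-by-taxa count matrix; metadata: one row per sample with
# time_point, treatment_group, and subject_ID (placeholder names).
fit <- emuFit(
  formula = ~ time_point * treatment_group,
  data = metadata,
  Y = counts,
  cluster = metadata$subject_ID  # assumed argument for clustered
                                 # (repeated-measures) inference
)
```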

That may not be what you want to hear, but I hope it at least answers your question.

giulianonetto commented 3 months ago

Cool, I will check radEmu out! Thank you so much!!