drizopoulos / GLMMadaptive

GLMMs with adaptive Gaussian quadrature
https://drizopoulos.github.io/GLMMadaptive/

Behaviour of weights in GLMMadaptive #33

Closed · florianhartig closed this issue 3 years ago

florianhartig commented 3 years ago

Hi Dimitris,

Also, I had a question about the weights:

a) If they are simple multipliers of the log-likelihood, why can't they be supplied per observation?

b) If weights are supplied, I assume they are ignored in the simulation? Maybe it would be good to throw a warning in this case, as the user should be made aware that the model is fitted according to a different likelihood than the one it simulates from. This is a problem for DHARMa, but potentially also for other packages / code that uses the simulations for inference.

Best, Florian

drizopoulos commented 3 years ago

Hi Florian,

a) The likelihood in a multilevel model is defined at the higher-level units. E.g., say we have pupils in schools, then the likelihood contribution is at the school level, not the pupil level. Hence, in my view, this works as documented. See also here: https://drizopoulos.github.io/GLMMadaptive/articles/GLMMadaptive.html#generalized-linear-mixed-models-theory
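
For concreteness, a small sketch of what this means in practice (hypothetical toy data and variable names; it assumes the `weights` argument of `mixed_model()` as documented, i.e. one weight per group rather than per observation):

```r
library(GLMMadaptive)

## hypothetical toy data: pupils nested in schools
set.seed(1)
n_schools <- 20
n_pupils  <- 10
df <- data.frame(
  school = factor(rep(seq_len(n_schools), each = n_pupils)),
  x      = rnorm(n_schools * n_pupils)
)
re <- rep(rnorm(n_schools, sd = 1), each = n_pupils)   # school-level random effects
df$y <- rbinom(nrow(df), size = 1, prob = plogis(-0.5 + 0.8 * df$x + re))

## one weight per school (the higher-level unit at which the likelihood
## contributions are defined); a vector of length nrow(df) is not what
## the weights argument expects
w_school <- runif(n_schools, 0.5, 1.5)

fm <- mixed_model(fixed = y ~ x, random = ~ 1 | school, data = df,
                  family = binomial(), weights = w_school)
summary(fm)
```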

b) Yes, they are ignored. Following (a), the idea is to give different weights to the likelihood contributions of the different groups during estimation. But when we want to simulate from the model, I don't see why or how the weights should be used. The idea is to simulate new observations from the model using the correctly estimated parameters.
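
Continuing the sketch above, the simulation side would look something like this (assuming the package's `simulate()` method for fitted MixMod objects with the standard `nsim`/`seed` arguments):

```r
## the weights influence only the parameter estimates; the simulated
## responses below are drawn from the fitted model and do not reuse
## the fitting weights
sims <- simulate(fm, nsim = 250, seed = 123)
dim(sims)  # nrow(df) simulated responses per column
```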

I close this for now as I believe it works as documented.

florianhartig commented 3 years ago

Hi Dimitris,

many thanks for the explanations.

About the warning: the reason is that a model fitted with weights will estimate parameters that will not necessarily produce data resembling the observed data (since we are no longer optimizing the likelihood of the assumed data-generating process). As this may create problems for people who use the simulations for inferential calculations (e.g. DHARMa, but also a simulated LRT), I would prefer a warning.

In any case, I have already implemented a warning in DHARMa: https://github.com/florianhartig/DHARMa/blob/90ceb371ee0f55748f606435111343e7ad9132f7/DHARMa/R/compatibility.R#L458
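
For readers who want a similar safeguard in their own simulation-based code, a minimal sketch of such a check (this is not the actual DHARMa code, and the `$weights` component is an assumption about where a fitted MixMod object stores its weights):

```r
## hypothetical check before using simulations for inference:
## warn if the model was fitted with non-uniform likelihood weights,
## because the simulations will not reflect that weighted likelihood
checkWeights <- function(fittedModel) {
  w <- fittedModel$weights  # assumed location of the weights in the fit
  if (!is.null(w) && !all(w == 1)) {
    warning("Model was fitted with likelihood weights; simulations ignore ",
            "these weights, so simulation-based inference may be misleading.")
  }
  invisible(fittedModel)
}
```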

Cheers, Florian