dmbates closed this issue 3 years ago.
The `coeftable` method is using the wrong value of the fixed-effects coefficients, and the value of the standard deviations of the random effects includes a non-unit scale factor when it shouldn't. I will make those changes. The thing I am still uncertain about is the approximate standard errors.
I guess it is time to get back to profiling to make sure that the estimates of the variability are sensible. Alternatively, it might be better to first work out a parametric bootstrap method, as that is conceptually simpler.
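For reference, a minimal sketch of the parametric-bootstrap alternative, using the `parametricbootstrap` and `shortestcovint` functions from later MixedModels.jl releases (they may not have existed when this comment was written); the sleepstudy model is just an example choice:

```julia
using MixedModels, Random

# fit an example model to the bundled sleepstudy data
m = fit(MixedModel,
        @formula(reaction ~ 1 + days + (1 + days | subj)),
        MixedModels.dataset(:sleepstudy))

# refit the model to 1000 datasets simulated from its own estimates
boot = parametricbootstrap(MersenneTwister(42), 1000, m)

# shortest 95% coverage intervals for all parameters, as a check on
# the approximate standard errors
shortestcovint(boot)
```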
My vote is for profiling. At least for the fixed effects, profiling is super fast and reasonably accurate.
There's a bit of a chicken-and-egg situation here. For the fixed-effects parameters, the initial step size when profiling is usually based on the approximate standard error.
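As a sketch of that dependency (with a hypothetical `profiled_deviance` closure that refits the model with the j-th fixed-effects coefficient held at a given value and returns the deviance):

```julia
# Step away from the estimate in units of the approximate standard error;
# a poor SE therefore gives a poor initial grid for the profile.
function profile_beta(profiled_deviance, βhat, se, j; δ=0.5, nsteps=8)
    base = profiled_deviance(βhat[j])
    # signed square root of the deviance change: the ζ scale used in profile plots
    ζ(x) = sign(x - βhat[j]) * sqrt(max(profiled_deviance(x) - base, 0.0))
    grid = [βhat[j] + k * δ * se[j] for k in -nsteps:nsteps]
    return grid, ζ.(grid)
end
```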
@dmbates My work on JellyMe4 suggests that the standard errors are now close to lme4's, though often not as close as we would want.
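A sketch of that kind of cross-package check, fitting the same model directly in both packages via RCall rather than going through JellyMe4 (the sleepstudy model is again just an example choice):

```julia
using MixedModels, RCall

m_jl = fit(MixedModel,
           @formula(reaction ~ 1 + days + (1 + days | subj)),
           MixedModels.dataset(:sleepstudy))

# fit the same model with lme4; use ML (REML = FALSE) to match MixedModels.jl
R"""
library(lme4)
m_r <- lmer(Reaction ~ 1 + Days + (1 + Days | Subject), sleepstudy, REML = FALSE)
se_r <- sqrt(diag(vcov(m_r)))
"""
@rget se_r

# ratios near 1 indicate agreement of the fixed-effects standard errors
stderror(m_jl) ./ se_r
```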
In other words, just reminding us that we should revisit this when we get profiling done.
@palday The issue of displaying the correct parameter estimates has been fixed and a test added in 5fb6f6502, so I think this can be considered closed. What do you think?
Profiling already exists as a separate issue.
I agree.
In a message to R-SIG-Mixed-Models@R-Project.org with the subject "nAGQ > 1 in lme4::glmer gives unexpected likelihood", Ben Goldstein provided simulated data.
The peculiar thing is that fitting a simple GLMM by the Laplace approximation and with `nAGQ=11` produces identical parameter estimates in the `show` method. First, the `coeftable` method is suspect, so I think the results being reported are wrong; second, even the parameter estimates stored in the object are exactly the same, so maybe something about the nAGQ method is off.
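A sketch of the check implied here, assuming a data table `df` like the simulated data from that message, with a binary response `y`, a covariate `x`, and a grouping factor `g` (the column names are placeholders):

```julia
using MixedModels

form = @formula(y ~ 1 + x + (1 | g))

# Laplace approximation (nAGQ = 1) vs. 11-point adaptive Gauss-Hermite
# quadrature; note that nAGQ > 1 requires a single scalar random-effects term
m_laplace = fit(MixedModel, form, df, Bernoulli(); nAGQ=1)
m_agq     = fit(MixedModel, form, df, Bernoulli(); nAGQ=11)

# if the quadrature rule is actually being applied, the objective and the
# estimates should differ, at least slightly
deviance(m_laplace) - deviance(m_agq)
fixef(m_laplace) .- fixef(m_agq)
```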
This is using