[Closed] MLee72 closed this issue 1 month ago
In principle I believe M2() and other functions can use the dimension reduction technique, though this has not been implemented yet (there are stubs in the package with #TODO tags to remind me to do this at some point...). For now I believe you're stuck with the QMC approach, though know that this feature is on the long todo list. Note that setting QMC=TRUE will use 5,000 fixed quadrature points by default, which should run faster than dense quadrature via the quadpts input when there are many dimensions, since it doesn't explode due to the curse of dimensionality.
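A minimal sketch of the suggested switch, assuming the fitted bfactor object (`out`) from the question below, and assuming that `quadpts` also controls the number of QMC nodes when `QMC = TRUE` (as the 5,000-node default remark suggests):

```r
library(mirt)  # assumes 'out' is the fitted bfactor model from the question below

# QMC-based M2*; uses roughly 5,000 quasi-Monte Carlo nodes by default
M2.qmc <- M2(out, type = "M2*", QMC = TRUE)

# Stability check: rerun with a larger node count and compare the statistics
M2.qmc2 <- M2(out, type = "M2*", QMC = TRUE, quadpts = 10000)
```

If the M2 statistic and its fit indices barely change between the two runs, the default node count is likely adequate for your model.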
Hi Phil,
Thank you for your work and this great package!
I'm fitting bi-factor models with the mirt package and trying to assess model fit with M2. I know the 'bfactor' function converges quickly thanks to the dimensionality reduction technique, but the M2 statistics take too long to compute, and a message is printed suggesting QMC. Isn't the dimensionality reduction technique applied to the M2 calculation? If I have to use the QMC approach, how many nodes are appropriate to reduce computation time while retaining stability and accuracy?
Here is the model and estimation code I'm using.
############################################################
sfac <- rep(1:4, each = 5)  # every 5 items are loaded on each specific factor

i20.bi.mod <- "
G  = 1-20
S1 = 1-5
S2 = 6-10
S3 = 11-15
S4 = 16-20
"
######## --- equality constraints on the specific factor loadings

out <- mirt::bfactor(data = res, model = sfac, model2 = i20.bi.mod,
                     itemtype = "2PL", quadpts = 21,
                     technical = list(NCYCLES = 2000))
M2.out <- mirt::M2(out, type = "M2*", quadpts = 21)
############################################################
Thank you so much!

Best,
Minho Lee