SachaEpskamp / psychonetrics

An R package for Network Psychometrics

Error: Could not repair identification issue. Aborting search and returning previous model. #11

Closed: valentinasag closed this issue 1 year ago

valentinasag commented 2 years ago

I'm testing whether a network model with 14 indicators fits my data (n = 1008) better than a three-dimensional latent model, using the 'psychonetrics' package.
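For reference, the setup looks roughly like this (a sketch: "data" stands for my data frame, and the item-to-factor assignment in lambda is illustrative, not my exact specification):

```r
library("psychonetrics")
library("dplyr")

vars <- paste0("burnout.", 1:14)

# Network model: saturated GGM, pruned, then step-up search
prunedModel <- ggm(data, vars = vars) %>%
  runmodel() %>%
  prune(alpha = 0.01)
finalModel <- prunedModel %>% stepup()

# Three-dimensional latent variable model for comparison
# (the factor structure below is a placeholder):
lambda <- matrix(0, nrow = 14, ncol = 3)
lambda[1:5, 1]   <- 1
lambda[6:10, 2]  <- 1
lambda[11:14, 3] <- 1
latentModel <- lvm(data, lambda = lambda, vars = vars) %>%
  runmodel()

compare(network = finalModel, latent = latentModel)
```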

I get the following error message:

finalModel <- prunedModel %>% stepup

Error: Could not repair identification issue. Aborting search and returning previous model.

Warning messages:
1: In runmodel(emergencystart(x)) : Information matrix or implied variance-covariance matrix was not positive semi-definite. This can happen because the model is not identified, or because the optimizer encountered problems. Try running the model with a different optimizer using setoptimizer(...).
2: In runmodel(emergencystart(x)) : One or more parameters were estimated to be near its bounds. This may be indicative of, for example, a Heywood case, but also of an optimization problem. Interpret results and fit with great care. For unconstrained estimation, set bounded = FALSE.
3: In runmodel(emergencystart(x)) : Model might not have converged properly: mean(abs(gradient)) > 1.
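The warnings themselves point at a few knobs; in code, those suggestions look roughly like this (a sketch; "nlminb" is just one of the available optimizer names):

```r
# Warning 1: try a different optimizer via setoptimizer(...)
finalModel <- prunedModel %>%
  setoptimizer("nlminb") %>%
  runmodel() %>%
  stepup()

# Warning 2: unconstrained estimation with bounded = FALSE
finalModel <- prunedModel %>%
  runmodel(bounded = FALSE) %>%
  stepup()

# Re-initialize parameters if the optimizer got stuck;
# emergencystart() is the call shown in the warning traceback:
finalModel <- prunedModel %>%
  emergencystart() %>%
  runmodel() %>%
  stepup()
```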

When I run the model with 10 indicators (instead of the 14 in the scale) it still runs, no matter which combination of indicators I use. From 11 indicators onward it keeps crashing. So I checked the fit indices from 8 indicators up to 10, and the fit gets better and better, up to the point of near-perfect fit with 10 indicators (CFI = 1.0 and RMSEA = .01). I suspect that adding the extra indicators makes the fit too perfect, which causes convergence issues; this might have to do with the high inter-item correlations.
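(The fit indices I mention are read off psychonetrics' fit() output; a sketch, where the first 10 items are just one example subset:)

```r
# Fit measures (chi-square, CFI, RMSEA, ...) for a fitted model:
subModel <- ggm(data, vars = paste0("burnout.", 1:10)) %>%
  runmodel() %>%
  prune(alpha = 0.01) %>%
  stepup()
fit(subModel)
```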

Here is the cor-matrix: 
           burnout.1 burnout.2 burnout.3 burnout.4 burnout.5 burnout.6 burnout.7
burnout.1  1.0000000 0.6448964 0.7455040 0.5533467 0.7091184 0.6816461 0.5201119
burnout.2  0.6448964 1.0000000 0.7167819 0.6742280 0.7654734 0.7371680 0.6155033
burnout.3  0.7455040 0.7167819 1.0000000 0.6669355 0.8020159 0.8133098 0.6160001
burnout.4  0.5533467 0.6742280 0.6669355 1.0000000 0.6982751 0.6929471 0.5948466
burnout.5  0.7091184 0.7654734 0.8020159 0.6982751 1.0000000 0.8097514 0.6384737
burnout.6  0.6816461 0.7371680 0.8133098 0.6929471 0.8097514 1.0000000 0.6319321
burnout.7  0.5201119 0.6155033 0.6160001 0.5948466 0.6384737 0.6319321 1.0000000
burnout.8  0.5899201 0.6406835 0.6366945 0.6091116 0.6856933 0.6630217 0.7423392
burnout.9  0.5746420 0.6715012 0.6882767 0.6470393 0.7089069 0.7235720 0.8011067
burnout.10 0.5563096 0.6614141 0.6566182 0.6693690 0.6734236 0.6708950 0.7655066
burnout.11 0.5196241 0.6050055 0.6153183 0.5753019 0.6231131 0.6530510 0.7585188
burnout.12 0.4312258 0.5564350 0.5495108 0.6141187 0.5785846 0.5772272 0.6293454
burnout.13 0.4276255 0.5760747 0.5457028 0.6371080 0.5688834 0.5760273 0.5832512
burnout.14 0.3916993 0.5286722 0.5295581 0.5871355 0.5523888 0.5894933 0.6073607

           burnout.8 burnout.9 burnout.10 burnout.11 burnout.12 burnout.13 burnout.14
burnout.1  0.5899201 0.5746420 0.5563096  0.5196241  0.4312258  0.4276255  0.3916993
burnout.2  0.6406835 0.6715012 0.6614141  0.6050055  0.5564350  0.5760747  0.5286722
burnout.3  0.6366945 0.6882767 0.6566182  0.6153183  0.5495108  0.5457028  0.5295581
burnout.4  0.6091116 0.6470393 0.6693690  0.5753019  0.6141187  0.6371080  0.5871355
burnout.5  0.6856933 0.7089069 0.6734236  0.6231131  0.5785846  0.5688834  0.5523888
burnout.6  0.6630217 0.7235720 0.6708950  0.6530510  0.5772272  0.5760273  0.5894933
burnout.7  0.7423392 0.8011067 0.7655066  0.7585188  0.6293454  0.5832512  0.6073607
burnout.8  1.0000000 0.7925344 0.7741188  0.7444189  0.5981625  0.5695398  0.5718096
burnout.9  0.7925344 1.0000000 0.8119760  0.7953023  0.6437256  0.6255863  0.6442333
burnout.10 0.7741188 0.8119760 1.0000000  0.7345134  0.6185789  0.6016042  0.6301076
burnout.11 0.7444189 0.7953023 0.7345134  1.0000000  0.6151989  0.6022336  0.6387952
burnout.12 0.5981625 0.6437256 0.6185789  0.6151989  1.0000000  0.7404307  0.7831360
burnout.13 0.5695398 0.6255863 0.6016042  0.6022336  0.7404307  1.0000000  0.7461605
burnout.14 0.5718096 0.6442333 0.6301076  0.6387952  0.7831360  0.7461605  1.0000000
Has anyone run into similar problems? Could it be because of the high item correlations? Or any other ideas?
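One quick check for whether the high correlations make the matrix (near-)singular is its eigenvalue spectrum; a sketch, again with "data" standing in for my data frame:

```r
# A smallest eigenvalue at or near zero means the correlation
# matrix is (near-)singular, which can break identification:
cormat <- cor(data[, paste0("burnout.", 1:14)])
eigen(cormat, only.values = TRUE)$values
```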

P.S. I'm unsure whether this is the right place to post this; if not, could you tell me where to go with this issue? It seemed too specific for the Facebook group.

Thank you in advance.

SachaEpskamp commented 2 years ago

Could you perhaps provide a reproducible example?
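For instance, data with the same correlation structure as the matrix posted above could be simulated, so no raw data needs to be shared (a sketch, assuming that matrix is stored as cormat):

```r
# Simulate n = 1008 observations whose sample correlations exactly
# match the posted matrix (empirical = TRUE), then refit the models:
library("MASS")
set.seed(11)
simData <- as.data.frame(mvrnorm(
  n = 1008, mu = rep(0, 14), Sigma = cormat, empirical = TRUE
))
names(simData) <- paste0("burnout.", 1:14)
```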