I’m working on a confirmatory factor analysis (CFA) in R and have run into a snag. The initial model fit acceptably, but I wanted to improve it. Removing the item with the lowest R² (External7) worked well, but when I tried to remove the next-lowest item (Negative2) I got an error saying the QR decomposition of the Hessian could not be computed, which suggests the optimization didn’t converge.
Here’s a simplified version of my code:
library(sem)  # the arrow syntax below matches sem::specifyModel(); there is no "factorAnalysis" package on CRAN

set.seed(1)
data_matrix <- matrix(rnorm(1000), ncol = 10)  # 100 rows x 10 columns
colnames(data_matrix) <- paste0("Item", 1:10)  # model refers to items by name

# Note: the FACTOR1 <-> FACTOR2 covariance is required here; with only two
# indicators per factor, orthogonal factors would be under-identified.
model <- specifyModel(text = "
FACTOR1 -> Item1, f1i1, NA
FACTOR1 -> Item2, f1i2, NA
FACTOR2 -> Item3, f2i3, NA
FACTOR2 -> Item4, f2i4, NA
FACTOR1 <-> FACTOR1, NA, 1
FACTOR2 <-> FACTOR2, NA, 1
FACTOR1 <-> FACTOR2, phi, NA
Item1 <-> Item1, e1, NA
Item2 <-> Item2, e2, NA
Item3 <-> Item3, e3, NA
Item4 <-> Item4, e4, NA
")

result <- sem(model, S = cov(data_matrix[, 1:4]), N = nrow(data_matrix))
summary(result)
Any ideas on what might be causing this or how to fix it? I’m not sure if it’s a problem with my data or if I’m doing something wrong in the model specification.
I’ve encountered similar issues with CFA optimization before. One thing to consider is that removing items can sometimes lead to model identification problems, especially if you’re left with too few indicators per factor. It might be worth checking if your model is still properly identified after removing items.
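A quick back-of-the-envelope identification check is just arithmetic (base R, no SEM package needed): count the unique observed moments against the free parameters left after dropping items. The counts below assume a hypothetical two-factor model with four remaining indicators, fixed factor variances, and a free factor covariance.

```r
# Identification check: a CFA needs at least as many observed moments
# (unique variances + covariances) as free parameters.
n_items  <- 4                            # indicators remaining after dropping items
moments  <- n_items * (n_items + 1) / 2  # 4*5/2 = 10 unique moments
loadings <- 4                            # one free loading per indicator
errors   <- 4                            # one error variance per indicator
fac_cov  <- 1                            # one factor covariance (variances fixed to 1)
df <- moments - (loadings + errors + fac_cov)
df  # 1 -> identified, but only barely; a negative value means under-identified
```

A related rule of thumb: a factor needs at least three indicators to be identified on its own; two can suffice only when the factor is allowed to correlate with another.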
Another approach you could try is to use a different estimator. If you’re currently using ML (maximum likelihood), you might want to experiment with WLSMV (weighted least squares mean and variance adjusted) which can sometimes be more robust, especially with categorical or non-normal data.
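If you're open to lavaan (your code isn't lavaan, so this is a sketch with a hypothetical `my_data` data frame containing Item1–Item4), switching estimators is a one-argument change:

```r
library(lavaan)

model <- '
  FACTOR1 =~ Item1 + Item2
  FACTOR2 =~ Item3 + Item4
'
# ML is the default estimator; WLSMV is typically paired with
# declaring the items as ordered/categorical
fit_ml    <- cfa(model, data = my_data, std.lv = TRUE)
fit_wlsmv <- cfa(model, data = my_data, std.lv = TRUE,
                 ordered = paste0("Item", 1:4), estimator = "WLSMV")
```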
Also, have you looked at modification indices? They might give you insights into model misspecification that could be addressed without removing items. Sometimes adding a theoretically justifiable covariance between error terms can improve fit without sacrificing items.
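In lavaan, for example, the two steps look like this (`fit` being a previously fitted model object; the `Item1 ~~ Item3` line is purely illustrative):

```r
library(lavaan)
# Large modification indices flag constrained parameters (often error
# covariances) whose release would improve fit
modindices(fit, sort. = TRUE, minimum.value = 10)

# If a suggested error covariance is theoretically defensible,
# free it in the model syntax:
model_mi <- '
  FACTOR1 =~ Item1 + Item2
  FACTOR2 =~ Item3 + Item4
  Item1 ~~ Item3   # correlated errors, added on substantive grounds only
'
```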
Lastly, if the problem persists, you might want to consider re-evaluating your factor structure. Perhaps an exploratory factor analysis could shed light on a more stable solution for your data.
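For a quick exploratory pass you don't even need a SEM package; base R's `factanal()` will do (assuming a hypothetical `my_item_data` matrix of the raw items):

```r
# Quick EFA to see whether a 2-factor structure actually holds up
efa <- factanal(my_item_data, factors = 2, rotation = "promax")
print(efa, cutoff = 0.3)  # hide small loadings for readability
```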
hey maxrock, sounds like you hit a tricky spot there. removing items can sometimes destabilize a model. have you tried adjusting your starting values or raising the iteration limit? also double-check your data for outliers or odd patterns. if nothing works, maybe stick with the previous model or try a different approach. good luck!
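Both knobs are easy to turn. Sketched in lavaan (assuming `model` and a hypothetical `my_data`; the `control` list is handed to the nlminb optimizer):

```r
library(lavaan)
# Simpler starting values plus a higher iteration/evaluation cap can
# rescue an optimizer that stalls before converging
fit <- cfa(model, data = my_data, std.lv = TRUE,
           start = "simple",
           control = list(iter.max = 10000, eval.max = 20000))
```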
Hey there MaxRock56! 
Oof, that QR decomposition error can be a real head-scratcher, huh? I’ve been there too, and it’s no fun. 
Have you thought about playing around with the estimation setup? One caveat: FIML (full information maximum likelihood) is really a way of handling missing data under ML rather than a different estimator, but if incomplete cases are part of your problem, switching to it can work wonders for convergence.
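In lavaan, for instance, FIML is a single argument (again assuming `model` and a hypothetical `my_data`, and only relevant if the data actually contain missing values):

```r
library(lavaan)
# FIML uses all available observations instead of listwise deletion
fit_fiml <- cfa(model, data = my_data, std.lv = TRUE,
                missing = "fiml", estimator = "ML")
```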
Also, I’m curious - how many factors and items are you working with in total? Sometimes when we start removing items, we can accidentally end up with an under-identified model without realizing it. Might be worth double-checking that you’ve still got enough items per factor to keep things stable.
Oh, and here’s a wild thought - have you tried running a bootstrap analysis? It can sometimes help pinpoint where the instability is coming from. Plus, it’s always fun to watch your computer chug through a thousand iterations, right? 
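As a sketch of what that looks like in lavaan (same hypothetical `model` and `my_data` as above; expect it to take a while):

```r
library(lavaan)
# Bootstrapped standard errors; unstable parameters tend to show up
# as wildly varying estimates across the resamples
fit_boot <- cfa(model, data = my_data, std.lv = TRUE,
                se = "bootstrap", bootstrap = 1000)
parameterEstimates(fit_boot)
```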
Anyway, don’t give up! CFA can be a beast, but you’ve got this. Let us know how it goes or if you need any more brainstorming!