Closed: asierrl closed this issue 2 years ago
Could you specify which algorithm you have used, e.g. by pasting the main line of code you ran when the warnings occurred? Most of the included algorithms have no iterative procedures other than the loop over components. For instance, plsr() without overriding the defaults is routed to kernelpls.fit, which decomposes directly (per component) using SVD rather than iterations, and it handles multiple responses well.
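(As a reference note, not part of the original reply: a quick way to check which algorithm is in play, assuming the standard pls options and a fitted object such as the proba.out below.)
# Which fit function plsr() will dispatch to in this session
pls.options()$plsralg   # "kernelpls" unless it has been overridden
# Which algorithm a fitted model actually used
proba.out$method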
Sure, Kristian, this is the code that is giving me those warnings:
proba.out <- plsr(as.formula(paste("Response", "~", "Spectra")), scale=FALSE, ncomp=5, validation="CV", trace=FALSE, data=probadata)
Yes, I suspect the warnings correspond one to each of the components... but sometimes there are more than 50 warnings, while I am asking for a maximum of 25 components.
I tested a little and could reproduce the warning you get by overriding the default algorithm. Could you check this against your own code to see whether it is the culprit?
library(pls)
# Synthetic multiresponse data built from the gasoline NIR spectra
data(gasoline)
probadata <- list(Response = gasoline$NIR %*% matrix(rnorm(401*10), ncol=10), Spectra=gasoline$NIR)
# Default PLSR model (kernelpls)
proba.out <- plsr(as.formula(paste("Response", "~", "Spectra")), scale=FALSE, ncomp=5, validation="CV", trace=FALSE, data=probadata)
# Force iterative algorithm
pls.options(plsralg="oscorespls")
proba.out <- plsr(as.formula(paste("Response", "~", "Spectra")), scale=FALSE, ncomp=5, validation="CV", trace=FALSE, data=probadata)
# This run produces the "No convergence" warnings
# Reset default algorithm
pls.options(plsralg="kernelpls")
proba.out <- plsr(as.formula(paste("Response", "~", "Spectra")), scale=FALSE, ncomp=5, validation="CV", trace=FALSE, data=probadata)
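(Side note, a sketch not part of the original exchange: one way to capture and count the convergence warnings from a single call, assuming the probadata object defined above.)
# Collect warning messages instead of letting them accumulate in warnings()
convergence.warnings <- character(0)
proba.out <- withCallingHandlers(
  plsr(Response ~ Spectra, ncomp = 5, validation = "CV", data = probadata),
  warning = function(w) {
    convergence.warnings <<- c(convergence.warnings, conditionMessage(w))
    invokeRestart("muffleWarning")
  }
)
length(convergence.warnings)  # how many warnings the fit and cross-validation raised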
Yes, that seems to be the case. In fact, I was working systematically with the oscorespls algorithm (I was following the workflow of Burnett et al., Journal of Experimental Botany, Vol. 72, No. 18, pp. 6175–6189, 2021). When fitting a model with 8 components (which always produced warnings with oscorespls) using kernelpls I get no warnings. By the way, the example I sent you was not quite the right one, as with 6 components I only get warnings sometimes, not always. Can I go with kernelpls? What are the implications, and what does the difference mean? Thanks for your packages and help!!
Short version: you should be safe with the kernelpls variant. oscorespls is included mostly for historical reasons, as it is close to the original formulation from the 80s. kernelpls should in theory give the same results, with the same orthogonality properties. It is quicker (probably not noticeable with small data) and numerically more stable: there are no iterations with an arbitrary convergence cut-off, only SVDs and projections. Your scores, loadings and all the rest should be the same with the two algorithms, though oscorespls may be a bit off target when it reports non-convergence.
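(To illustrate that last point, a sketch using the synthetic probadata from above, not code from the original thread.)
pls.options(plsralg = "oscorespls")
fit.os <- plsr(Response ~ Spectra, ncomp = 5, data = probadata)
pls.options(plsralg = "kernelpls")
fit.kp <- plsr(Response ~ Spectra, ncomp = 5, data = probadata)
# Fitted values do not depend on per-component sign conventions
max(abs(fitted(fit.os) - fitted(fit.kp)))              # should be numerically negligible
# Scores and loadings may flip sign per component, so compare absolute values
max(abs(abs(scores(fit.os)) - abs(scores(fit.kp))))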
Great, thanks a lot, Kristian!
I am getting multiple lack-of-convergence warnings while fitting a multiple-response plsr to a data set with 6 response variables and 242 explanatory variables (NIR bands). I applied different variable transformations to each of the responses in order to get less biased and leptokurtic distributions. The number of warnings appears to depend on the number of components used (ncomp) and on the cross-validation method (fewer warnings for CV than for LOO), which suggests the issue is a lack of convergence in some of the validation fits. I had always read that pls is very robust and almost always converges, which is why finding so many convergence warnings surprises me.
Despite these warnings, I still get (apparently) valid values for fitted.values, RMSEP and R2. Does that mean that those convergence failures during validation do not affect the final model (which I doubt)? How can I deal with or troubleshoot the issue? Is there a way to exclude the non-converged fits from the results? The warning I am getting is, most frequently:
Warning in fitFunc(Xtrain, Y[-seg, , drop = FALSE], ncomp, Y.add = Y.add[-seg, : No convergence in 100 iteration
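(For completeness, a sketch of a possible workaround if one does want to stay with the iterative algorithm, not advice from the thread: the 100-iteration cap in that message is the maxit default of oscorespls.fit, and plsr() should forward extra arguments to the fit function, so the cap can be raised. Shown here with the synthetic probadata from above.)
pls.options(plsralg = "oscorespls")
proba.out <- plsr(Response ~ Spectra, ncomp = 8, validation = "CV",
                  data = probadata, maxit = 500)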