Closed DelfsEngineering closed 7 years ago
It depends on what you mean by LDA with Mahalanobis distance. As you can see from the description of the LDA method (by clicking the link "more information on this topic" from the LDA help in the online lab at http://mlweb.loria.fr/lalolab/ ), LDA is based on probability density functions of normal distributions, which naturally involve the squared Mahalanobis distance. So plain LDA might be what you are looking for.
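To illustrate the connection, here is a minimal sketch (not the lab's actual implementation) of the LDA rule written explicitly in terms of squared Mahalanobis distances: with a shared (pooled) covariance matrix and equal class priors, LDA assigns each point to the class whose mean is nearest in Mahalanobis distance. The function name and the numpy-based formulation are my own for illustration.

```python
import numpy as np

def lda_mahalanobis_predict(X_train, y_train, X_test):
    """Classify each test point by the smallest squared Mahalanobis
    distance to a class mean, using the pooled within-class covariance.
    With equal priors, this is exactly the LDA decision rule."""
    classes = np.unique(y_train)
    means = {c: X_train[y_train == c].mean(axis=0) for c in classes}

    # Pooled within-class covariance (shared by all classes in LDA)
    n, d = X_train.shape
    S = np.zeros((d, d))
    for c in classes:
        Xc = X_train[y_train == c] - means[c]
        S += Xc.T @ Xc
    S /= (n - len(classes))
    S_inv = np.linalg.inv(S)

    preds = []
    for x in X_test:
        # Squared Mahalanobis distance to each class mean
        d2 = {c: (x - means[c]) @ S_inv @ (x - means[c]) for c in classes}
        preds.append(min(d2, key=d2.get))
    return np.array(preds)
```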
Regarding leave-one-out (LOO), this is implemented by the cross-validation function applicable to any Classifier/Regression model. By default, this performs 5-fold cross-validation, but you can specify the number of folds as a third argument; setting it to the number of training samples gives LOO. For training data in the matrix X and label vector Y, this could be done with
model = new Classifier(LDA)
RecRate = model.cv(X, Y, Y.length)
To make this easier, I just added a wrapper for that, so now you can just call
RecRate = model.loo(X, Y)
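For readers who want to reproduce the idea outside the lab, here is a small sketch of what such an LOO routine computes: hold out one sample at a time, train on the rest, and report the fraction of held-out samples classified correctly. The function names and the nearest-class-mean demo classifier are my own illustrative choices, not the lab's code.

```python
import numpy as np

def loo_recognition_rate(X, y, predict_one):
    """Leave-one-out cross-validation: for each sample, train on all
    the others, predict the held-out one, and return the fraction of
    correct predictions (the recognition rate)."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i
        if predict_one(X[mask], y[mask], X[i]) == y[i]:
            correct += 1
    return correct / n

# Illustrative classifier: predict the class whose mean is nearest
# (plain Euclidean distance, just to keep the example self-contained)
def nearest_mean(X_train, y_train, x):
    classes = np.unique(y_train)
    dists = [np.linalg.norm(x - X_train[y_train == c].mean(axis=0))
             for c in classes]
    return classes[int(np.argmin(dists))]
```

For example, `loo_recognition_rate(X, Y, nearest_mean)` on well-separated clusters returns a rate close to 1.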
I hope this answers your questions.
We would like to use LDA with Mahalanobis Distance and Leave One Out cross validation. Can we use your code base as it is for this type of LDA?
If not, could this be added?
Cheers