zachmayer / caretEnsemble

caret models all the way down :turtle:

Adaptive LASSO with caretEnsemble #255

Open lsemployeeoftheyear opened 1 year ago

lsemployeeoftheyear commented 1 year ago

Hey all,

I'd like to run a caretStack model with the adaptive LASSO (see explanation here). Code for something like that is below. As you can see, you essentially run a ridge regression on the training data first, then use the inverse absolute values of the ridge coefficients as penalty weights for a lasso regression.

library(glmnet)

## Data: assume a data frame dataF whose first column is the class
x <- as.matrix(dataF[, -1])            # predictors only
y <- as.double(as.matrix(dataF[, 1]))  # class only

## Ridge regression to create the adaptive weights vector
## (parallel = TRUE requires a registered parallel backend, e.g. doParallel)
set.seed(999)
cv.ridge <- cv.glmnet(x, y, family = "binomial", alpha = 0, parallel = TRUE, standardize = TRUE)
## Inverse absolute ridge coefficients, dropping the intercept; using gamma = 1
w3 <- 1 / abs(coef(cv.ridge, s = cv.ridge$lambda.min)[2:(ncol(x) + 1), 1])^1
w3[w3 == Inf] <- 999999999  # replace infinite weights (zero ridge coefficients)

## Adaptive lasso
set.seed(999)
cv.lasso <- cv.glmnet(x, y, family = "binomial", alpha = 1, parallel = TRUE,
                      standardize = TRUE, type.measure = "auc", penalty.factor = w3)
plot(cv.lasso)
plot(cv.lasso$glmnet.fit, xvar = "lambda", label = TRUE)
abline(v = log(cv.lasso$lambda.min))
abline(v = log(cv.lasso$lambda.1se))
lasso_coefs <- coef(cv.lasso, s = "lambda.1se")  # avoid shadowing base::coef
selected_attributes <- lasso_coefs@i[-1] + 1  # column indices in dataF, given the layout above

Obviously when you've just got your training data this is pretty easy, but I'd like to use this as a meta classifier, which means I'd need the weights from the prediction columns also. Not knowing the inner workings of caretList super well, does anyone have any suggestions regarding how I might train the ridge regression on the training data + predictions to get the penalty factors to pass to the meta classifier?
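For what it's worth, here's a rough sketch of the idea: fit the base models with caretList, treat their prediction columns as the design matrix, and run the ridge step on those columns to get penalty factors for the meta-learner. All data and object names below are invented, the predictions are in-sample for brevity (the out-of-fold predictions stored in each model's $pred slot would be safer), and predict.caretList's return format varies across caretEnsemble versions:

```r
library(caret)
library(caretEnsemble)
library(glmnet)

set.seed(42)
dat <- twoClassSim(300)  # simulated binary-classification data from caret
ctrl <- trainControl(method = "cv", number = 5,
                     savePredictions = "final", classProbs = TRUE)
model_list <- caretList(Class ~ ., data = dat, trControl = ctrl,
                        methodList = c("glm", "rpart"))

## Prediction columns from the base models become the meta-features
meta_x <- as.matrix(as.data.frame(predict(model_list, newdata = dat)))
y <- as.double(dat$Class == levels(dat$Class)[1])

## Ridge step on the meta-features, exactly as in the single-level example
cv.ridge <- cv.glmnet(meta_x, y, family = "binomial", alpha = 0)
w <- 1 / abs(coef(cv.ridge, s = "lambda.min")[-1, 1])
w[!is.finite(w)] <- 999999999

## w can now be passed as penalty.factor to the alpha = 1 meta-learner fit
cv.lasso <- cv.glmnet(meta_x, y, family = "binomial", alpha = 1, penalty.factor = w)
```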

Thanks

zachmayer commented 1 year ago

I would write a custom adaptive-lasso class, with a training function that returns an S3 object and a predict method. For extra credit, write unit tests and examples and publish it as an R package on GitHub and CRAN!
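For example, a minimal S3 sketch of what that class could look like (the function and class names here are made up, and the infinite-weight cap mirrors the snippet above):

```r
library(glmnet)

## Training function: ridge step for the weights, then weighted lasso; returns an S3 object
adaptive_lasso <- function(x, y, gamma = 1, ...) {
  ridge <- cv.glmnet(x, y, alpha = 0, ...)
  w <- 1 / abs(coef(ridge, s = "lambda.min")[-1, 1])^gamma
  w[!is.finite(w)] <- 999999999  # cap weights for zero ridge coefficients
  fit <- cv.glmnet(x, y, alpha = 1, penalty.factor = w, ...)
  structure(list(fit = fit, weights = w), class = "adaptive_lasso")
}

## Predict method dispatches to the underlying cv.glmnet fit
predict.adaptive_lasso <- function(object, newx, s = "lambda.1se", ...) {
  predict(object$fit, newx = as.matrix(newx), s = s, ...)
}
```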

Once you have an adaptive lasso class, you can plug it into caretEnsemble as a custom model.

lsemployeeoftheyear commented 1 year ago

glmnet basically has the capability to do it already though, doesn't it? Is there any way to create a custom class and then register it with caret under a new method name, the way most SVM models have a tag for each kernel type?
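You don't need to get it into caret itself: train() accepts a custom model spec as a list in place of a method name string. A rough regression-only sketch, following caret's documented custom-model list shape (names illustrative, not tested beyond the basics):

```r
library(caret)
library(glmnet)

## Custom caret model spec for the adaptive lasso (regression only, for brevity)
adaLasso <- list(
  label = "Adaptive LASSO",
  library = "glmnet",
  type = "Regression",
  parameters = data.frame(parameter = "gamma", class = "numeric", label = "Gamma"),
  grid = function(x, y, len = NULL, search = "grid") data.frame(gamma = 1),
  fit = function(x, y, wts, param, lev, last, weights, classProbs, ...) {
    x <- as.matrix(x)
    ridge <- glmnet::cv.glmnet(x, y, alpha = 0, nfolds = 5)
    w <- 1 / abs(coef(ridge, s = "lambda.min")[-1, 1])^param$gamma
    w[!is.finite(w)] <- 999999999
    glmnet::cv.glmnet(x, y, alpha = 1, penalty.factor = w, nfolds = 5)
  },
  predict = function(modelFit, newdata, submodels = NULL) {
    as.numeric(predict(modelFit, newx = as.matrix(newdata), s = "lambda.1se"))
  },
  prob = NULL,
  sort = function(x) x
)

fit <- train(mpg ~ ., data = mtcars, method = adaLasso,
             trControl = trainControl(method = "cv", number = 3))
```

Since caretStack passes extra arguments through to train, the same list should in principle also work as the meta-learner, e.g. caretStack(model_list, method = adaLasso, ...), though I haven't verified that.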

lsemployeeoftheyear commented 1 year ago

I feel like it would make sense for someone to do the same with relaxed lasso too, now that glmnet supports that...

zachmayer commented 1 year ago

glmnet has built-in support for the relaxed lasso, so it should just work out of the box with caret and caretEnsemble.
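For reference, the built-in relaxed lasso is just relax = TRUE in glmnet (sketch on simulated data; whether train() forwards relax = TRUE through to glmnet untouched is untested here):

```r
library(glmnet)

set.seed(1)
x <- matrix(rnorm(100 * 20), 100, 20)
y <- rnorm(100)

## Relaxed lasso: glmnet refits the active set with less shrinkage and blends via gamma
cv.rel <- cv.glmnet(x, y, alpha = 1, relax = TRUE)
coef(cv.rel, s = "lambda.1se", gamma = 0.5)  # gamma = 0 is fully relaxed, 1 is the standard lasso
```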

Glmnet doesn't yet support the adaptive lasso or adaptive relaxed lasso, but perhaps those could be feature requests for the glmnet maintainers!