JuliaStats / GLMNet.jl

Julia wrapper for fitting Lasso/ElasticNet GLM models using glmnet

Relaxed Lasso #58

Open azev77 opened 3 years ago

azev77 commented 3 years ago

The R glmnet package includes a relaxed lasso option, which recent research has shown performs very well. Would it be possible for GLMNet.jl to support this?

JackDunnNZ commented 3 years ago

I had a quick look at how this is implemented in the R package, and it looks like the logic for the relaxed option sits in the R code rather than in the core Fortran library. So unfortunately we can't simply expose the relaxed option from the compiled library; instead, this R logic would need to be reimplemented in the Julia package, which is a bigger undertaking.
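
For reference, the relaxed step itself is conceptually small: for each lambda, refit the active set without a penalty and blend that refit with the penalized solution via a mixing parameter gamma. Below is a rough sketch of what such a reimplementation could look like on top of a fitted path (gaussian case only; relaxed_betas and its gamma keyword are made-up names, and intercepts are ignored for brevity):

```julia
using GLMNet

# Sketch only: re-estimate the active coefficients at each lambda by
# ordinary least squares and blend them with the lasso solution.
# gamma = 1 recovers the lasso fit, gamma = 0 the pure unpenalized refit.
function relaxed_betas(X, y, path; gamma = 0.0)
    B = zeros(size(X, 2), length(path.lambda))
    for j in eachindex(path.lambda)
        beta = path.betas[:, j]               # lasso coefficients at lambda[j]
        active = findall(!iszero, beta)
        isempty(active) && continue
        Xa = [ones(size(X, 1)) X[:, active]]  # unpenalized refit with intercept
        ols = Xa \ y
        refit = copy(beta)
        refit[active] .= ols[2:end]           # keep refitted slopes, drop intercept
        B[:, j] .= gamma .* beta .+ (1 - gamma) .* refit
    end
    return B
end

# e.g. path = glmnet(X, y); B = relaxed_betas(X, y, path; gamma = 0.5)
```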

azev77 commented 3 years ago

I see. That means it's likely to be faster in the Julia version, then.

AdaemmerP commented 2 years ago

Thanks for the great package! Would it be possible to change 'CompressedPredictorMatrix' to a mutable struct? This would allow modifying the predicted values and implementing the relaxed lasso.

JackDunnNZ commented 2 years ago

I think that should be fine, although it might be better to update the code to use a generic sparse matrix rather than the custom struct. I'm not too familiar with the internals of the package, but it feels like that should be possible?
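
To illustrate the idea, here is a hedged sketch that materializes the coefficient path into a standard SparseMatrixCSC, which callers can then modify and use with ordinary linear algebra; betas_as_sparse is a hypothetical helper, not part of GLMNet.jl:

```julia
using GLMNet, SparseArrays

# Sketch: CompressedPredictorMatrix is an AbstractMatrix, so its entries
# can be copied into a generic sparse matrix that supports mutation.
function betas_as_sparse(path)
    B = spzeros(size(path.betas)...)
    for j in axes(path.betas, 2), i in axes(path.betas, 1)
        v = path.betas[i, j]
        iszero(v) || (B[i, j] = v)
    end
    return B
end
```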

AdaemmerP commented 2 years ago

Yes, a sparse matrix might be better for storing the parameters. Regarding the struct, I think line 90 would need to be changed to a mutable struct (https://github.com/JuliaStats/GLMNet.jl/blob/master/src/GLMNet.jl). Or is there any other possibility to change and save the values? I want to modify the parameters and then use them with GLMNet.predict().

JackDunnNZ commented 2 years ago

> Or is there any other possibility to change and save the values?

You may be able to use Setfield.jl or Accessors.jl to easily update the GLMNetPath with new coefficients, something like

```julia
new_path = @set path.betas = new_betas
```

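In full, a minimal sketch (assuming path is a fitted GLMNetPath, X and y are the training data, and the replacement coefficients keep the type of the betas field, which is a CompressedPredictorMatrix):

```julia
using GLMNet, Accessors

path = glmnet(X, y)                      # X, y assumed already in scope
new_betas = path.betas                   # placeholder: build the modified coefficients here
new_path = @set path.betas = new_betas   # non-mutating update of the immutable GLMNetPath
yhat = GLMNet.predict(new_path, X)       # one column of predictions per lambda
```
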
AdaemmerP commented 2 years ago

Nice, thanks for the tip!

azev77 commented 2 years ago

@AdaemmerP did you have any luck implementing the relaxed Lasso?

AdaemmerP commented 2 years ago

@azev77 Yes, I was able to implement it, but within a time series framework (https://github.com/AdaemmerP/DetectSparsity/blob/main/CaseStudies/Functions.jl, lines 337-501). I also used the Lasso.jl package for it.