superyyzg / fast_proximal_gradient_descent_l0_regularization

Code for the paper "Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems"
Apache License 2.0

is it possible to do multi-task sparse regression? #1

Open pswpswpsw opened 4 years ago

pswpswpsw commented 4 years ago

Very interesting work!

Is it possible to apply this to a multi-task learning problem? Say, is it faster than multi-task Lasso?

superyyzg commented 4 years ago

Thank you for your comment! The fast PGD method presented in this paper is for the L0-regularized problem. If multi-task Lasso is your concern, please refer to works on (local) linear convergence for Lasso problems. On the other hand, we are working on generalizing our methods to a broader class of non-convex problems, and new results in this direction will be posted on my website (yingzhenyang.com).
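For readers landing on this issue: the basic proximal gradient iteration for an L0-regularized least-squares objective can be sketched as below. This is a minimal generic illustration (plain PGD with the hard-thresholding proximal operator), not the accelerated algorithm from the paper; the function names, the test problem, and the step-size choice are all illustrative assumptions.

```python
import numpy as np

def prox_l0(z, lam, step):
    # Proximal operator of step * lam * ||x||_0: hard thresholding.
    # Entries with |z_i| <= sqrt(2 * step * lam) are set to zero.
    out = z.copy()
    out[np.abs(z) <= np.sqrt(2.0 * step * lam)] = 0.0
    return out

def pgd_l0_least_squares(A, b, lam, n_iter=500):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_0 by proximal gradient descent.
    # Step size 1/L, where L = ||A||_2^2 is the gradient's Lipschitz constant.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)       # gradient of the smooth part
        x = prox_l0(x - step * grad, lam, step)
    return x

# Illustrative noiseless sparse-recovery problem (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [3.0, -2.0, 1.5]
b = A @ x_true
x_hat = pgd_l0_least_squares(A, b, lam=0.1)
```

A multi-task variant would replace the vector `x` by a coefficient matrix and apply a row-wise (group) sparsity penalty instead of the entrywise L0 norm; the paper itself addresses the single-task L0 setting.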