JuliaAI / MLJLinearModels.jl

Generalized Linear Regression Models (penalized regressions, robust regressions, ...)

Quantile reg #4

Closed tlienart closed 5 years ago

tlienart commented 5 years ago

algos

(...) ADMM, MM, and CD approaches, while MM and CD are faster and ADMM slower than the IP algorithm available in quantreg. The results so far suggest that the MM algorithm is the best-suited for non-regularized (composite) quantile regression among the four methods tested, especially for data sets with large n and relatively small p. In regularized quantile regression, all methods perform similarly in terms of variable selection, but CD and ADMM show clear superiority in run time, particularly relative to the IP and MM methods when p is large. In the case of regularized composite quantile regression, CD and ADMM dis

(...)

Applying existing optimization algorithms to (composite) quantile regression requires a non-trivial reformulation of the problem due to the non-linearity and non-differentiability of the loss and regularization terms of the objective. The well-known quantreg package for R (Koenker, 2017) uses an interior point (IP) approach for quantile and composite quantile regression with the option of l1 (lasso) regularization for the former and no regularization options for the latter. Although advanced IP algorithms in quantreg, such as the one using prediction-correction (Mehrotra, 1992) for non-regularized quantile regression, have greatly improved upon earlier attempts using simplex methods, the time spent on matrix inversion in IP approaches (Chen and Wei, 2005) motivates us to seek faster algorithms for quantile and composite quantile regression, particularly for high-dimensional data where regularization is required. In addition, following the conjectures of Fan and Li (2001), Zou (2006) showed lasso variable selection—currently the most commonly-implemented penalty for quantile regression—to be inconsistent in certain situations and presented adaptive lasso regularization as a solution. Our work in the present paper is thus motivated by both a search for faster quantile regression algorithms as well as the lack of publicly-available methods for adaptive-lasso regularized quantile and composite quantile regression, particularly for high-dimensional data.

Quoted from https://arxiv.org/pdf/1709.04126.pdf (the cqrReg package in R).
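For concreteness, the objective the quoted passage describes is the pinball (check) loss plus an (adaptive) lasso penalty. Below is a minimal Julia sketch; the function names, default λ, and penalty weights w are illustrative and not taken from cqrReg or from this package.

```julia
# Pinball (check) loss ρ_τ(r) = r(τ - 1{r<0}): piecewise linear with a kink
# at 0, which is the non-differentiability the quoted passage refers to.
pinball(r, τ) = r * (τ - (r < 0))

# Penalized quantile objective: (1/n) Σᵢ ρ_τ(yᵢ - xᵢ'β) + λ Σⱼ wⱼ|βⱼ|.
# wⱼ ≡ 1 gives the plain lasso; data-driven weights (e.g. wⱼ = 1/|β̂ⱼ|^γ from
# a pilot fit) give the adaptive lasso of Zou (2006).
function qr_objective(β, X, y; τ = 0.5, λ = 0.1, w = ones(length(β)))
    r = y .- X * β
    return sum(pinball.(r, τ)) / length(y) + λ * sum(w .* abs.(β))
end
```

Both the kink in ρ_τ at zero and the |βⱼ| terms are non-smooth, which is why off-the-shelf gradient methods don't apply directly and the paper resorts to ADMM/MM/CD reformulations.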

refs

tlienart commented 5 years ago

May revisit the "faster" algorithms at some point; for the moment, LBFGS and IWLS are satisfactory.
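For context, here is a minimal sketch of IWLS for the (unpenalized) pinball loss, using the standard reweighting that rewrites ρ_τ(r) as w·r². This is illustrative only (the ε floor, tolerances, and names are my choices), not the solver code used in this package.

```julia
using LinearAlgebra

# IWLS for τ-quantile regression: at each step solve a weighted least-squares
# problem whose weights re-express the pinball loss as a quadratic.
function iwls_quantile(X, y; τ = 0.5, maxiter = 100, ε = 1e-6, tol = 1e-8)
    β = X \ y                                  # least-squares warm start
    for _ in 1:maxiter
        r = y .- X * β
        # ρ_τ(r) = (τ·1{r>0} + (1-τ)·1{r≤0})·|r| = w·r² with w as below;
        # ε floors |r| so exactly-fit points don't blow up the weights
        w = @. ifelse(r > 0, τ, 1 - τ) / max(abs(r), ε)
        β_new = (X' * Diagonal(w) * X) \ (X' * Diagonal(w) * y)
        norm(β_new - β) < tol && return β_new
        β = β_new
    end
    return β
end

# e.g. fit the conditional 0.9-quantile on synthetic data:
# X = randn(200, 3); y = X * [1.0, -2.0, 0.5] .+ randn(200)
# β̂ = iwls_quantile(X, y; τ = 0.9)
```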