tomaszkacprzak opened 4 months ago
Welcome back @tomaszkacprzak! $\epsilon$ is a vector, so I'm not sure how $1[\epsilon > 0]$ is to be interpreted when it multiplies $\Vert \epsilon \Vert^2$?
Is it not something like:
$L (\epsilon, \alpha) = 2 · \sum_i ( \alpha + (1 − 2\alpha) · 1[\epsilon_i > 0] ) \epsilon_i^2$
Can you give the implementation a try through a PR, like you did for the positive group Lasso? Thanks a lot!
Also, I believe it should be $1/2$ instead of $2$ to get rid of the coefficient that pops up after differentiation:
$L (\epsilon, \alpha) = \frac{1}{2} \sum_i ( \alpha + (1 − 2\alpha) · 1[\epsilon_i > 0] ) \epsilon_i^2$
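As a quick sanity check of the $1/2$ factor, here is a minimal NumPy sketch (not skglm code) that evaluates this loss and compares the closed-form gradient $(\alpha + (1 - 2\alpha)\,1[\epsilon_i > 0])\,\epsilon_i$ against finite differences:

```python
import numpy as np

def quadquad_value(eps, alpha):
    """Asymmetric quadratic loss: 0.5 * sum_i w_i * eps_i**2,
    with w_i = alpha + (1 - 2 * alpha) * 1[eps_i > 0]."""
    weights = alpha + (1 - 2 * alpha) * (eps > 0)
    return 0.5 * np.sum(weights * eps ** 2)

def quadquad_grad(eps, alpha):
    """Gradient w.r.t. eps: the 1/2 cancels the 2 from differentiation,
    leaving w_i * eps_i with no stray factor of 2."""
    weights = alpha + (1 - 2 * alpha) * (eps > 0)
    return weights * eps

rng = np.random.default_rng(0)
eps, alpha = rng.standard_normal(5), 0.3

# central finite-difference check of the closed-form gradient
h = 1e-6
fd = np.array([
    (quadquad_value(eps + h * np.eye(5)[i], alpha)
     - quadquad_value(eps - h * np.eye(5)[i], alpha)) / (2 * h)
    for i in range(5)
])
print(np.allclose(fd, quadquad_grad(eps, alpha), atol=1e-5))  # True
```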
Hi @mathurinm, thank you for the fast reply. Indeed, there was a mistake in my equation; the one you gave should be right. I can try a PR. I could start from the Quadratic datafit and modify all the functions. Do you expect any difficulties with optimisation?
Yes, that sounds like a plan, and no, I don't foresee any difficulty!
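For reference, a rough standalone sketch of what such a datafit could compute, assuming only `value` and `gradient` methods. This is deliberately not skglm's actual `BaseDatafit` API (the exact required methods, e.g. the per-coordinate pieces used by coordinate descent and any per-sample normalisation, should be copied from the existing Quadratic datafit), and the class name is hypothetical:

```python
import numpy as np

class DoubleQuadraticSketch:
    """Illustrative asymmetric quadratic datafit (hypothetical name).

    L(w) = 0.5 * sum_i (alpha + (1 - 2*alpha) * 1[eps_i > 0]) * eps_i**2,
    with eps = X @ w - y, following the corrected formula in the thread.
    alpha = 0.5 gives equal weights on both sides of zero.
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha

    def value(self, y, Xw):
        eps = Xw - y
        weights = self.alpha + (1 - 2 * self.alpha) * (eps > 0)
        return 0.5 * np.sum(weights * eps ** 2)

    def gradient(self, X, y, Xw):
        # The indicator is piecewise constant, so differentiating gives
        # X.T @ (weights * eps); the loss is continuously differentiable
        # at eps_i = 0 since both branches have zero slope there.
        eps = Xw - y
        weights = self.alpha + (1 - 2 * self.alpha) * (eps > 0)
        return X.T @ (weights * eps)

# tiny usage example with random data
rng = np.random.default_rng(0)
X, y = rng.standard_normal((20, 3)), rng.standard_normal(20)
w = np.zeros(3)
df = DoubleQuadraticSketch(alpha=0.3)
print(df.value(y, X @ w), df.gradient(X, y, X @ w))
```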
@tomaszkacprzak could you also please let us know in which context you use skglm, and what it brings you compared to alternatives?
Hi @tomaszkacprzak, any news on this?
Description of the feature
In many applications there is a need for an asymmetric loss that weights negative and positive residuals differently. A good function for this is the double quadratic loss (quad-quad):
$L (\epsilon, \alpha) = 2 · ( \alpha + (1 − 2\alpha) · 1[\epsilon > 0] ) · || \epsilon ||_2^2$
where $\epsilon = Xw - y$ is the residual, $\alpha$ controls the degree of asymmetry, and $1[\cdot]$ is the indicator function. For $\alpha=0.5$ this function is equivalent to the Quadratic datafit.
Would it be possible to add a DoubleQuadratic datafit and make it compatible with the Lasso and other penalties?
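For what it's worth, with the per-element form that came up in the discussion, the $\alpha = 0.5$ case reduces to the plain squared loss up to a constant factor (which only rescales the regularization strength); this is easy to check numerically (sketch, not skglm code):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.standard_normal(100)

alpha = 0.5
weights = alpha + (1 - 2 * alpha) * (eps > 0)   # all weights equal 0.5
asym = 0.5 * np.sum(weights * eps ** 2)

# alpha = 0.5 gives exactly half of the usual squared loss 0.5 * ||eps||^2
print(np.isclose(asym, 0.25 * np.sum(eps ** 2)))  # True
```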
Considered alternatives
Another possibility is to use the Pinball loss.
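For comparison, a small sketch (using one common parameterization of the pinball/quantile loss; not skglm code) showing that both losses treat the sign of the residual asymmetrically, but pinball grows linearly with a kink at zero while quad-quad grows quadratically and is smooth there:

```python
import numpy as np

def pinball(eps, tau):
    """Pinball (quantile) loss per residual, one common convention:
    tau * eps if eps >= 0, else (tau - 1) * eps."""
    return np.where(eps >= 0, tau * eps, (tau - 1) * eps)

def quadquad(eps, alpha):
    """Asymmetric quadratic loss per residual (per-element form above)."""
    weights = alpha + (1 - 2 * alpha) * (eps > 0)
    return 0.5 * weights * eps ** 2

eps = np.linspace(-3, 3, 7)
print(pinball(eps, tau=0.3))     # linear growth, kink at 0
print(quadquad(eps, alpha=0.3))  # quadratic growth, smooth at 0
```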