lindahua / EmpiricalRisks.jl

Julia implementation of predictors and loss functions for empirical risk minimization

Add regularizers that may not have a defined gradient #2

Open fredo-dedup opened 9 years ago

fredo-dedup commented 9 years ago

For some regularizers, the proximal operator is a (short) iterative calculation that does not yield a defined gradient expression: L1Ball (for the true LASSO), Simplex, ... This is a problem for regular gradient-based optimization algorithms, but not for proximal gradient methods, which do not need these gradients, so such regularizers could be a useful addition to this package. I understand that the EmpiricalRisks package should be as generic as possible, but would it be possible to include them anyway? Perhaps through a subtype of Regularizer with only a prox! method but no value_and_addgrad!()?
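To make the idea concrete, here is a minimal sketch (in current Julia syntax) of what a prox-only regularizer could look like: projection onto the L1 ball, defined with only a prox! method and no gradient. The standalone Regularizer abstract type, the L1Ball name, and the prox!(reg, out, x) signature are hypothetical stand-ins here, not the actual EmpiricalRisks API.

# Minimal sketch (hypothetical names and signature, not the EmpiricalRisks API):
# a regularizer that only provides its proximal operator, here the Euclidean
# projection onto the L1 ball {x : ||x||_1 <= radius}.

abstract type Regularizer end      # stand-in for the package's abstract type

struct L1Ball <: Regularizer
    radius::Float64
end

# prox!(reg, out, x): write the projection of x onto the L1 ball into out,
# using the sort-based algorithm of Duchi et al. (2008); no closed-form gradient exists.
function prox!(reg::L1Ball, out::AbstractVector{Float64}, x::AbstractVector{Float64})
    r = reg.radius
    if sum(abs, x) <= r
        copyto!(out, x)            # already inside the ball
        return out
    end
    u = sort(abs.(x), rev=true)    # magnitudes, sorted descending
    cs = cumsum(u)
    k = findlast(i -> u[i] > (cs[i] - r) / i, 1:length(x))
    θ = (cs[k] - r) / k            # soft-threshold level
    @. out = sign(x) * max(abs(x) - θ, 0.0)
    return out
end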

If you think it's worthwhile, I'd be ready to contribute that extension to the package.

Thanks.

lindahua commented 9 years ago

One way is to define a prox! method for a regularizer, and then call the proximal methods to solve the problem. In this way, you don't actually have to define the value_and_addgrad! method.
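For illustration, here is roughly what a single proximal gradient step could look like when only prox! is available: the gradient of the smooth loss term is needed, but the regularizer's own gradient never appears. The prox_grad_step! name, the grad_loss! callback, and the step size η are hypothetical, and the prox!(reg, out, x) signature follows the sketch in the previous comment.

# One proximal gradient step (hypothetical signatures): the smooth loss needs a
# gradient, the regularizer is handled entirely through its proximal operator.
function prox_grad_step!(x::AbstractVector{Float64}, g::AbstractVector{Float64},
                         reg, grad_loss!, η::Float64)
    grad_loss!(g, x)           # g <- gradient of the smooth loss at x
    @. x = x - η * g           # forward (gradient) step
    prox!(reg, x, copy(x))     # backward (proximal) step, no gradient of reg needed
    # (for a projection regularizer such as the L1 ball, the step size η does not
    #  enter the prox; a general regularizer would take the prox of η*r instead)
    return x
end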

I think we can add this as traits, instead of expanding the type hierarchy:

support_prox(r::SomeRegularizer) = true
support_grad(r::SomeRegularizer) = false
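As an illustration of how such traits could be consumed (everything below apart from the two trait names is hypothetical, not the package API), a solver could rely on generic fallbacks and check the traits up front instead of inspecting the type hierarchy:

# Fallback trait values for an arbitrary regularizer (illustrative defaults):
support_prox(r) = false
support_grad(r) = true

# A gradient-based solver can then fail early with a clear message, while a
# proximal solver would check support_prox(r) the same way:
function require_grad(r)
    support_grad(r) ||
        error("$(typeof(r)) defines no gradient; use a proximal solver instead")
    return r
end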

What do you think about this?

Also, contributions are always welcome and appreciated.