Some exciting updates:

- accelerated proximal gradient descent with backtracking line search, wicked fast! (the loop is sketched below, after the footnote)
- spectral regularization with a nuclear norm penalty (controlled by parameter \beta) to encourage low-rank solutions (closes #14)*; the singular value thresholding prox is sketched below
* combining fused LASSO (\lambda * \alpha > 0) with spectral regularization (\beta > 0) is not implemented, since it would require a proximal operator for the combined fused-LASSO-plus-nuclear-norm term (neither of which is differentiable). The current implementation does allow a (differentiable) l2 penalty on the first time derivative (\lambda > 0, \alpha = 0) in combination with spectral regularization (\beta > 0), as sketched at the end of these notes. Fused LASSO is still available on its own, i.e., without spectral regularization (\beta = 0).
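For reference, here is a minimal sketch of what an accelerated proximal gradient (FISTA-style) loop with backtracking line search looks like. The names `f`, `grad_f`, and `prox`, and the parameters `L0` and `eta`, are illustrative placeholders for the smooth/nonsmooth split, not the package's actual API:

```python
# A minimal FISTA-style sketch: accelerated proximal gradient descent with
# backtracking line search. All names here are illustrative, not this
# package's API.
import numpy as np

def fista(f, grad_f, prox, x0, L0=1.0, eta=2.0, max_iter=500, tol=1e-8):
    """Minimize f(x) + g(x), where prox(v, t) solves the proximal step for g."""
    x = x_prev = x0
    t = 1.0
    L = L0  # running estimate of the Lipschitz constant of grad_f
    for _ in range(max_iter):
        # Nesterov momentum: extrapolate from the last two iterates.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        g = grad_f(y)
        fy = f(y)
        # Backtracking: shrink the step 1/L until the quadratic upper bound holds.
        while True:
            x_new = prox(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= fy + g.ravel() @ d.ravel() + 0.5 * L * (d.ravel() @ d.ravel()):
                break
            L *= eta
        x_prev, x, t = x, x_new, t_next
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x
```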
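The nuclear-norm proximal step behind the spectral regularizer is singular value soft-thresholding; a minimal sketch (again, the function name is illustrative) is:

```python
# A minimal sketch of the nuclear-norm prox used for spectral regularization:
# soft-threshold the singular values. This is the standard singular value
# thresholding construction, not code taken from the package.
import numpy as np

def prox_nuclear(V, thresh):
    """Prox of thresh * ||X||_* : shrink singular values toward zero."""
    U, s, Vt = np.linalg.svd(V, full_matrices=False)
    s = np.maximum(s - thresh, 0.0)  # soft-thresholding of singular values
    return (U * s) @ Vt              # low-rank reconstruction
```

Plugged into the loop above, the nuclear norm term with weight \beta would correspond to something like `prox=lambda V, t: prox_nuclear(V, beta * t)`.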
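The combination the footnote does allow works because the l2 penalty on the first time derivative is smooth, so it folds into the differentiable part `f` while the nuclear norm stays in the prox. A minimal sketch, assuming the variable is a matrix with time along axis 1 and using illustrative names:

```python
# A minimal sketch of the differentiable l2 penalty on the first time
# derivative (\lambda > 0, \alpha = 0 case). The 0.5 scaling and the names
# are illustrative assumptions, not the package's exact convention.
import numpy as np

def smooth_diff_penalty(X, lam):
    """0.5 * lam * sum of squared first differences along time (axis 1)."""
    D = np.diff(X, axis=1)  # first time derivative
    return 0.5 * lam * np.sum(D * D)

def smooth_diff_grad(X, lam):
    """Gradient of the penalty above (lam * D^T D X along the time axis)."""
    G = np.zeros_like(X)
    D = np.diff(X, axis=1)
    G[:, :-1] -= lam * D  # each difference pulls its left endpoint
    G[:, 1:] += lam * D   # and pushes its right endpoint
    return G
```

Since this term is differentiable, its gradient simply adds to `grad_f` in the sketch above, leaving the nuclear-norm prox untouched.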