numericalalgorithmsgroup / dfols

Python-based Derivative-Free Optimizer for Least-Squares
https://numericalalgorithmsgroup.github.io/dfols/
GNU General Public License v3.0

Regularization/generic loss function #4

Open · lindonroberts opened this issue 5 years ago

lindonroberts commented 5 years ago

Modify DFO-LS to allow different loss functions (not just sum-of-squares) when the analytic form of the loss is known, so that the full model can be built using first derivatives (e.g. currently the loss is y -> y^2, with first/second derivatives y -> 2y and y -> 2). That is, add generic support for composite functions. The relevant reference point is the "loss" input to scipy.optimize.least_squares; a sketch of the kind of loss specification this means is given below.
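As a concrete illustration (not part of the current DFO-LS API, and only an assumed shape for such an interface), a loss specification in this style could supply rho together with its first and second derivatives. The Huber loss is a standard example, written here to match the y -> y^2 convention above:

```python
import numpy as np

# Hypothetical sketch of a loss specification: rho(y) plus its first and
# second derivatives, mirroring the current hard-coded case
# rho(y) = y^2, rho'(y) = 2y, rho''(y) = 2.
def huber(y, delta=1.0):
    """Huber loss (scaled to match the y^2 convention) and its derivatives."""
    y = np.asarray(y, dtype=float)
    small = np.abs(y) <= delta
    rho = np.where(small, y**2, 2.0 * delta * np.abs(y) - delta**2)
    drho = np.where(small, 2.0 * y, 2.0 * delta * np.sign(y))
    d2rho = np.where(small, 2.0, 0.0)
    return rho, drho, d2rho
```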

Also, add the ability to include a regularization term (with known structure), e.g. Tikhonov regularization/ridge regression.
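In the meantime, a Tikhonov/ridge term can be emulated with the existing dfols.solve interface by appending scaled copies of the variables to the residual vector. A minimal workaround sketch (the residual function, starting point and weight are illustrative):

```python
import numpy as np
import dfols

# ||r(x)||^2 + lam * ||x||^2  ==  || [ r(x); sqrt(lam) * x ] ||^2,
# so the regularizer can be folded into the residual vector by the caller.
lam = 1e-2  # illustrative regularization weight

def residuals(x):
    # Example objective: Rosenbrock-style residuals
    return np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])

def regularised_residuals(x):
    return np.concatenate([residuals(x), np.sqrt(lam) * x])

soln = dfols.solve(regularised_residuals, np.array([-1.2, 1.0]))
print(soln.x)
```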

lindonroberts commented 5 years ago

Scipy function: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.least_squares.html
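For reference, a minimal example of the "loss" input being pointed to, using only the documented scipy.optimize.least_squares API (the residual function and starting point are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x):
    return np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])

# 'loss' replaces the plain cost 0.5*sum(f_i^2) with 0.5*sum(rho(f_i^2)).
# Built-in options: 'linear', 'soft_l1', 'huber', 'cauchy', 'arctan';
# a callable must return rho(z), rho'(z), rho''(z) for z = f_i^2.
res = least_squares(residuals, np.array([-1.2, 1.0]), loss='soft_l1')
print(res.x)
```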

lindonroberts commented 5 years ago

A simpler option (also separately useful) would be to have weighted least-squares: either