accosmin / nano

C++ library [machine learning & numerical optimization] - superseded by libnano
MIT License

All batch optimization methods should work #34

Closed accosmin closed 8 years ago

accosmin commented 8 years ago

Each batch optimization method (GD, CGD*, LBFGS) should have at least one line-search parametrization with which it ALWAYS converges for all test functions, given enough iterations.

For example, there is currently no (CGD + line-search) combination that always works in the benchmark program.

accosmin commented 8 years ago

Investigate whether the line-search parameters {c1 = 1e-4, c2 = 0.1} are OK. Maybe {1e-4, 0.7} as suggested in Boyd's book?!

accosmin commented 8 years ago

Double check the Goldstein-Price test function: it is the only one that fails now!

accosmin commented 8 years ago

Made all the optimizers work with a line-search configuration for relatively low convergence criteria (10**-5 or 10**-6), but they all fail when high precision is requested.

Using a 10**-8 criterion may be useful for reaching the local minima with good accuracy, because there are some problems where the computed solution is farther away than 10**-3 from a local minimum even when using the 10**-6 criterion.