accosmin / nano

C++ library [machine learning & numerical optimization] - superseded by libnano
MIT License

Continuation methods #113

Open accosmin opened 8 years ago

accosmin commented 8 years ago

Implement some basic continuation methods for global optimization. Use various smoothing functions h(x) = (1 - lambda) * f(x) + lambda * g(x), sweeping lambda from 1 (the easy surrogate) down to 0 (the original objective), where g(x) can be:

Need to add unit tests and update the benchmark programs.

accosmin commented 8 years ago

This paper looks very promising: https://arxiv.org/pdf/1503.03712.pdf

This paper is also interesting (they parametrize the activation function to vary from almost linear to non-linear): https://arxiv.org/pdf/1702.00758.pdf