accosmin / nano

C++ library [machine learning & numerical optimization] - superseded by libnano
MIT License

ASGD with momentum #139

Closed by accosmin 7 years ago

accosmin commented 7 years ago

Implement an ASGD-like stochastic optimizer where, instead of averaging the past states, we use a tuned momentum of them.

accosmin commented 7 years ago

We should use very small momentum factors (e.g. in the range [1e-6, 1e-2]) so that the update "averages" a large number of iterations and thus stays close to ASGD.
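
A minimal sketch of the idea, not the library's actual API (the class and member names `asgd_momentum_t`, `alpha`, `beta` are hypothetical): a plain SGD step followed by an exponential moving average of the parameters, where a very small momentum factor `beta` effectively averages many past states and so approximates ASGD's running average.

```cpp
#include <vector>
#include <cstddef>

// Sketch: SGD step plus an exponential moving average of the parameters.
// With beta << 1 the average spans many iterations, mimicking plain ASGD.
struct asgd_momentum_t
{
    double alpha{0.01};   // learning rate (hypothetical default)
    double beta{1e-4};    // momentum/averaging factor, tuned in [1e-6, 1e-2]

    std::vector<double> x;      // current parameters
    std::vector<double> xavg;   // momentum-averaged parameters

    explicit asgd_momentum_t(std::size_t dims) : x(dims, 0.0), xavg(dims, 0.0) {}

    // one stochastic update given the gradient at the current parameters
    void step(const std::vector<double>& gradient)
    {
        for (std::size_t i = 0; i < x.size(); ++i)
        {
            x[i] -= alpha * gradient[i];

            // exponential moving average of past states: replaces the
            // uniform running average used by classic ASGD
            xavg[i] = (1.0 - beta) * xavg[i] + beta * x[i];
        }
    }

    // the averaged state is the one returned/evaluated, as in ASGD
    const std::vector<double>& state() const { return xavg; }
};
```

With `beta = 1e-4`, for example, the averaged state has an effective memory of roughly the last 10'000 iterations, which is why the factor needs to be tuned together with the expected number of optimization steps.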