accosmin / nano

C++ library [machine learning & numerical optimization] - superseded by libnano
MIT License

Time averaging stochastic updates #89

Closed accosmin closed 8 years ago

accosmin commented 8 years ago

Use an exponential running average for the parameters (as described in the SGA/SIA or ADAM papers).

Goal: smooth out SG or AG updates.

accosmin commented 8 years ago

All stochastic optimizers now use an exponential running average of their updates. This results in a significantly less noisy evolution of the parameters.