mmahesh / variants-of-rmsprop-and-adagrad

SC-Adagrad, SC-RMSProp and RMSProp algorithms for training deep networks proposed in
https://mmahesh.github.io/show_pub1/

Create ipython notebook comparing all methods #2

Open mmahesh opened 6 years ago

mmahesh commented 6 years ago

An initial basic version that uses just a single hyperparameter setting (chosen individually for each optimizer) would be fine.
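A minimal sketch of what such a comparison could look like, here as a plain script rather than a notebook. It uses textbook Adagrad and RMSProp updates on a toy strongly convex quadratic; the SC-Adagrad/SC-RMSProp variants are omitted, and the function names, hyperparameters, and test problem are all placeholders, not this repository's API:

```python
import numpy as np

def adagrad(grad_fn, x0, lr=0.1, eps=1e-8, steps=200):
    # Textbook Adagrad: accumulate squared gradients, scale steps element-wise.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = grad_fn(x)
        v += g * g
        x -= lr * g / (np.sqrt(v) + eps)
    return x

def rmsprop(grad_fn, x0, lr=0.01, beta=0.9, eps=1e-8, steps=200):
    # Textbook RMSProp: exponential moving average of squared gradients.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = grad_fn(x)
        v = beta * v + (1 - beta) * g * g
        x -= lr * g / (np.sqrt(v) + eps)
    return x

# Toy strongly convex objective f(x) = ||x||^2 / 2, whose gradient is x.
grad = lambda x: x
x0 = np.array([1.0, -2.0])

# One fixed hyperparameter setting per optimizer, as suggested above.
for name, opt in [("adagrad", adagrad), ("rmsprop", rmsprop)]:
    print(name, np.linalg.norm(opt(grad, x0)))
```

In a notebook version, the loop would instead record the objective value at every step so the per-optimizer convergence curves can be plotted side by side.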