HuangxingLin123 / Learning-Rate-Dropout

PyTorch implementation of Learning Rate Dropout.

MNIST setup #1


ifeherva commented 4 years ago

I noticed that on MNIST most optimizers do not reach 100% train accuracy (or oscillate), while in my implementation I reach 100% train accuracy after about 10 epochs.

Is it possible to publish the code/setup for the MNIST experiments as well?
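For reference, a minimal sketch of how train accuracy is usually measured in this kind of comparison. This is not the repository's actual MNIST script (which is what the issue asks for); the model, optimizer, and hyperparameters are placeholder assumptions, and synthetic tensors stand in for the real `torchvision.datasets.MNIST` download:

```python
import torch
from torch import nn

torch.manual_seed(0)

# Synthetic stand-in for MNIST: 256 samples of 1x28x28 "images", 10 classes.
# A real run would load torchvision.datasets.MNIST here instead.
x = torch.randn(256, 1, 28, 28)
y = torch.randint(0, 10, (256,))

# Placeholder MLP; the paper's architecture may differ.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # assumed optimizer/lr
loss_fn = nn.CrossEntropyLoss()

# Full-batch training for 10 epochs, matching the time scale mentioned above.
for epoch in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Train accuracy: fraction of *training* samples classified correctly.
with torch.no_grad():
    train_acc = (model(x).argmax(dim=1) == y).float().mean().item()
print(f"train accuracy after 10 epochs: {train_acc:.3f}")
```

Whether an optimizer "reaches 100%" under this metric depends heavily on batch size, learning rate, and model capacity, which is why the exact experimental setup matters for reproducing the paper's curves.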