torch / optim

A numeric optimization package for Torch.

Problem Solved. Setting learningRate and learningRateDecay in ADAM does not work. #148

Closed muhanzhang closed 7 years ago

muhanzhang commented 7 years ago

Hi, it seems that the learningRate and learningRateDecay I set for ADAM get replaced by the default values (0.001 and 0) instead of the user-set values every time it is called. Can anyone take a look?

Thanks!
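For reference, a common cause of this symptom is recreating the config/state table inside the training loop. A minimal usage sketch, assuming the standard `optim.adam(feval, params, optimState)` API (here `feval`, `params`, and `nEpochs` are placeholder names for the closure, flattened parameters, and epoch count):

```lua
require 'optim'

-- Create the state table ONCE, outside the training loop.
-- optim.adam stores its running moments (state.m, state.v) and
-- step counter (state.t) in this same table, so rebuilding the
-- table every iteration resets the optimizer, and any code path
-- that passes a fresh or empty table falls back to the defaults
-- (learningRate = 0.001, learningRateDecay = 0).
local optimState = {
   learningRate = 0.01,       -- user-set value, overrides the 0.001 default
   learningRateDecay = 1e-4,  -- user-set value, overrides the 0 default
}

for epoch = 1, nEpochs do
   -- Reuse the SAME optimState table on every call.
   optim.adam(feval, params, optimState)
end
```

If the table is instead constructed inside the loop (or a different table is passed each call), the user-set values and the accumulated Adam state are silently discarded on every step.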