yuvval:
Hi, it looks like the Adam optimizer's learning rate is not affected by the function input and is instead fixed to lr0=0.0002: https://github.com/kelvinxu/arctic-captions/blob/master/optimizers.py#L83
Is this intended behavior? Thank you.
elliottd:
@yuvval I don't think this is the intended behaviour. You can change the code to make the learning rate user-definable in the function definition in optimizers.py. The version of Adam implemented here has since been superseded, so you may want to check the latest version of the paper for details on the default hyperparameters. I have implemented both of these changes in my fork of the code.
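For reference, below is a minimal sketch of the kind of change described above: the learning rate is taken from the `lr` argument rather than hard-coded as `lr0 = 0.0002`, and the decay defaults follow the published version of the Adam paper (beta1=0.9, beta2=0.999, eps=1e-8). The signature mirrors the other update rules in optimizers.py, but the body is illustrative and not the exact code from the fork.

```python
import numpy
import theano
import theano.tensor as tensor

def adam(lr, tparams, grads, inp, cost,
         beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam updates with a user-definable learning rate.

    `lr` (a float or Theano shared scalar) replaces the hard-coded
    lr0 = 0.0002; beta1/beta2/eps default to the values recommended
    in the published version of the Adam paper.
    """
    updates = []

    # Timestep counter for bias correction.
    t_prev = theano.shared(numpy.float32(0.), name='adam_t')
    t = t_prev + 1.
    # Bias-corrected step size, built from the caller-supplied lr.
    lr_t = lr * tensor.sqrt(1. - beta2 ** t) / (1. - beta1 ** t)

    for (k, p), g in zip(tparams.items(), grads):
        # Per-parameter first and second moment accumulators.
        m_prev = theano.shared(p.get_value() * 0., name='%s_m' % k)
        v_prev = theano.shared(p.get_value() * 0., name='%s_v' % k)
        m = beta1 * m_prev + (1. - beta1) * g       # 1st moment estimate
        v = beta2 * v_prev + (1. - beta2) * g ** 2  # 2nd moment estimate
        step = lr_t * m / (tensor.sqrt(v) + eps)
        updates.append((m_prev, m))
        updates.append((v_prev, v))
        updates.append((p, p - step))
    updates.append((t_prev, t))

    return theano.function(inp, cost, updates=updates, profile=False)
```

With this version, a call such as `adam(0.001, tparams, grads, inp, cost)` actually uses the supplied rate instead of silently overriding it.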
yuvval:
Thanks @elliottd!
yuvval closed this issue 7 years ago.