kelvinxu / arctic-captions


A bug in setting Adam optimizer learning rate? #34

Closed: yuvval closed this issue 7 years ago

yuvval commented 8 years ago

Hi, it looks like the Adam optimizer's learning rate is not affected by the function input and is instead fixed to a hardcoded `lr0 = 0.0002`: https://github.com/kelvinxu/arctic-captions/blob/master/optimizers.py#L83
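For example, a script along these lines should reproduce the symptom (a sketch, not code from the repo: it assumes the `adam(lr, tparams, grads, inp, cost)` signature in optimizers.py, and the toy parameter and cost are made up for illustration):

```python
import numpy
import theano
import theano.tensor as tensor

from optimizers import adam  # the function linked above

# A toy model: one parameter and a quadratic cost.
W = theano.shared(numpy.float32(5.), name='W')
tparams = {'W': W}
cost = W ** 2
grads = tensor.grad(cost, [W])

lr = tensor.scalar(name='lr')

# Rebuild the optimizer for each trial so the moment estimates start fresh.
# The printed parameter values come out identical even though the learning
# rates differ by ten orders of magnitude, because `lr` is ignored inside adam().
for trial_lr in (1e-9, 10.0):
    W.set_value(numpy.float32(5.))
    f_grad_shared, f_update = adam(lr, tparams, grads, [], cost)
    f_grad_shared()
    f_update(trial_lr)
    print(trial_lr, W.get_value())
```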

Is this the intended behavior? Thank you

elliottd commented 7 years ago

@yuvval I don't think this is the intended behaviour. You can change the code to make the learning rate user-definable in the function definition in optimizers.py.

The version of Adam implemented here has since been superseded, so you might also want to check the latest version of the paper for the updated default hyperparameters.

I have implemented both of these changes in my fork of the code.
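For anyone who finds this later, here is a minimal sketch of what the fixed function can look like (an illustration only, not the exact code from either repository: it keeps the `adam(lr, tparams, grads, inp, cost)` interface of optimizers.py, and the `b1=0.9, b2=0.999, e=1e-8` defaults are the ones recommended in the published version of the Adam paper):

```python
import numpy
import theano
import theano.tensor as tensor


def adam(lr, tparams, grads, inp, cost, b1=0.9, b2=0.999, e=1e-8):
    # Shared variables that carry the gradients from f_grad_shared to f_update.
    gshared = [theano.shared(p.get_value() * 0., name='%s_grad' % k)
               for k, p in tparams.items()]
    gsup = [(gs, g) for gs, g in zip(gshared, grads)]

    # Computes the cost and stashes the gradients.
    f_grad_shared = theano.function(inp, cost, updates=gsup)

    updates = []
    t = theano.shared(numpy.float32(0.))
    t_new = t + 1.
    # Bias-corrected step size (Algorithm 1 of the Adam paper), driven by
    # the caller-supplied `lr` instead of a hardcoded constant.
    lr_t = lr * tensor.sqrt(1. - b2 ** t_new) / (1. - b1 ** t_new)

    for p, g in zip(tparams.values(), gshared):
        m = theano.shared(p.get_value() * 0.)
        v = theano.shared(p.get_value() * 0.)
        m_new = b1 * m + (1. - b1) * g               # first moment estimate
        v_new = b2 * v + (1. - b2) * tensor.sqr(g)   # second moment estimate
        updates.append((m, m_new))
        updates.append((v, v_new))
        updates.append((p, p - lr_t * m_new / (tensor.sqrt(v_new) + e)))
    updates.append((t, t_new))

    # `lr` is a real input here, so f_update(0.0005) actually steps with 0.0005.
    f_update = theano.function([lr], [], updates=updates)

    return f_grad_shared, f_update
```

Passing `lr` as a Theano input rather than baking it into the graph also makes it easy to anneal the learning rate at runtime without recompiling.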

kelvinxu commented 7 years ago

Thanks @elliottd!