learnables / learn2learn

A PyTorch Library for Meta-learning Research
http://learn2learn.net
MIT License

Reptile Vision example has constant meta learning rate #342

Closed mi92 closed 1 year ago

mi92 commented 2 years ago

In the following reptile example script https://github.com/learnables/learn2learn/blob/master/examples/vision/reptile_miniimagenet.py:111, the meta-learning rate computation has a bug: the rate remains constant and never decays.

new_lr = frac_done * meta_lr + (1 - frac_done) * meta_lr

Compare this to the original reptile code: https://github.com/openai/supervised-reptile/blob/master/supervised_reptile/train.py:55

cur_meta_step_size = frac_done * meta_step_size_final + (1 - frac_done) * meta_step_size

To fix this, an additional parameter could be added, e.g. meta_lr_final (analogous to meta_step_size_final in the original Reptile code).
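For illustration, here is a minimal sketch of the schedule the proposed fix would produce, assuming a hypothetical meta_lr_final parameter (mirroring meta_step_size_final in the OpenAI code); this is not the example script itself:

```python
# Minimal sketch (not the library's code): how the annealed meta learning rate
# would be computed with a hypothetical `meta_lr_final` parameter, mirroring
# `meta_step_size_final` in openai/supervised-reptile.

def anneal_meta_lr(iteration, num_iterations, meta_lr=1.0, meta_lr_final=0.0):
    """Linearly interpolate from meta_lr at the start to meta_lr_final at the end."""
    frac_done = iteration / num_iterations
    return frac_done * meta_lr_final + (1 - frac_done) * meta_lr


# The current expression, frac_done * meta_lr + (1 - frac_done) * meta_lr,
# simplifies to meta_lr for every iteration, i.e. no decay at all.
for it in (0, 10_000, 20_000):
    print(it, anneal_meta_lr(it, num_iterations=20_000, meta_lr=1.0, meta_lr_final=0.0))
```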

seba-1511 commented 2 years ago

Hello @mi92,

Thanks for reporting this discrepancy! Have you tried fixing it by using the same final learning rate? If it works better, would you like to submit a PR, since you found the bug?

mi92 commented 2 years ago

I haven't actually tried which works better yet, but I can gladly start a PR adding meta_lr_final so the decay works properly (otherwise people may believe the learning rate is decaying when in fact it is not).

seba-1511 commented 2 years ago

That would be great, thanks!

seba-1511 commented 1 year ago

Closing since fixed.