Closed crcrpar closed 5 years ago
As stated in the comment, this was a workaround for lr_scheduler. If you use the optimizer together with an lr_scheduler, the value of group['lr'] changes as training progresses, and final_lr is recomputed from it so that it stays in proportion. You may refer to the PyTorch lr_scheduler source code for the details.
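A minimal sketch of the behavior being described (not AdaBound itself): a PyTorch scheduler mutates group['lr'] in place, so a value recomputed from group['lr'] each step, as on the linked line 110, scales along with it. The final_lr entry and base_lr name here are placeholders mirroring AdaBound's code.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

# One dummy parameter so the optimizer has a param group to mutate.
param = torch.nn.Parameter(torch.zeros(1))
base_lr = 0.1
optimizer = SGD([param], lr=base_lr)
# StepLR halves group['lr'] after every scheduler.step() call.
scheduler = StepLR(optimizer, step_size=1, gamma=0.5)

group = optimizer.param_groups[0]
group['final_lr'] = 0.1  # hypothetical final_lr, as stored by AdaBound

for step in range(3):
    # Recomputing from the (possibly scheduled) group['lr'] keeps
    # final_lr in proportion to the current learning rate.
    final_lr = group['final_lr'] * group['lr'] / base_lr
    print(step, group['lr'], final_lr)
    optimizer.step()
    scheduler.step()
```

With a constant group['lr'] the recomputed value would indeed always equal group['final_lr']; the ratio only matters once a scheduler is attached.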
Thank you very much.
IIRC, because group['lr'] will never be changed, final_lr will always be the same as group['final_lr']. Is this intended? https://github.com/Luolc/AdaBound/blob/6fa826003f41a57501bde3e2baab1488410fe2da/adabound/adabound.py#L110