facebookresearch / schedule_free

Schedule-Free Optimization in PyTorch
Apache License 2.0

Retrieve LR #16

Closed · bhack closed this 2 months ago

bhack commented 2 months ago

What is the best way to retrieve the LR for this optimizer at each train step to plot an LR curve without the LR scheduler?

Is it similar to: https://discuss.pytorch.org/t/get-current-lr-of-optimizer-with-adaptive-lr/24851/2

adefazio commented 2 months ago

The LR doesn't change during optimization. Schedule-Free achieves convergence through iterate averaging rather than by decaying the learning rate, so there is no schedule to plot.
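Since the learning rate is constant, reading it once from the optimizer's `param_groups` is sufficient. A minimal sketch using plain `torch.optim.SGD` as a stand-in (assumption: schedule-free optimizers subclass `torch.optim.Optimizer`, so the same access pattern applies to them):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
# stand-in for a schedule-free optimizer; both expose param_groups
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# retrieve the current LR from each parameter group;
# for Schedule-Free this value stays the same at every step
lrs = [group["lr"] for group in opt.param_groups]
print(lrs)  # [0.1]
```

This is the same mechanism as in the linked forum thread; the only difference here is that, with no scheduler, the plotted curve would simply be a flat line.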