Tony-Y / pytorch_warmup

Learning Rate Warmup in PyTorch
https://tony-y.github.io/pytorch_warmup/
MIT License

UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()` #13

Closed · Angryrou closed this issue 1 year ago

Angryrou commented 1 year ago

Hi Tony, I got a warning similar to #5 when using `warmup.UntunedLinearWarmup` after upgrading my PyTorch to 1.12.1:

> UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate

Could you help me double-check whether I can still safely ignore this warning on my PyTorch version?
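
For anyone else hitting this: the warning fires when `lr_scheduler.step()` runs before the optimizer has taken its first step. A minimal sketch of an ordering that triggers it (the model, optimizer, and scheduler here are toy placeholders, not from my actual code):

```python
import torch

# Hypothetical toy setup just to illustrate the call-order issue.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5)

for step in range(10):
    loss = model(torch.randn(4, 10)).sum()
    optimizer.zero_grad()
    loss.backward()
    lr_scheduler.step()  # WRONG ORDER: scheduler stepped before the optimizer,
    optimizer.step()     # so PyTorch >= 1.1.0 emits the UserWarning above.
```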

Tony-Y commented 1 year ago

I didn't get the warning when running the sample code for pytorch_lightning with PyTorch 1.12.1. (That code was written in response to #8.) Your problem is probably caused by a misuse of the API. Could you try the official example in your PyTorch environment?
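
For context, the correct ordering looks roughly like the README usage below. This is a sketch, not the exact official example; the model, optimizer, and scheduler choices are placeholders:

```python
import torch
import pytorch_warmup as warmup

# Placeholder model and training length for illustration.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=0.001)
num_steps = 100
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_steps)
warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)

for step in range(num_steps):
    loss = model(torch.randn(4, 10)).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                    # optimizer steps first ...
    with warmup_scheduler.dampening():  # ... then the warmup-dampened
        lr_scheduler.step()             # LR scheduler, so no warning.
```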

Angryrou commented 1 year ago

Thanks! After refactoring my code to align with the official example, the problem is solved. You were exactly right.