yueatsprograms / uda_release

Unsupervised Domain Adaptation through Self-Supervision

Correct learning rate #3

Closed. gyglim closed this issue 4 years ago.

gyglim commented 4 years ago

Hi Yu,

I get the following warning:

```
torch/optim/lr_scheduler.py:122: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`.
In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`.
Failure to do this will result in PyTorch skipping the first value of the learning rate schedule.
See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
```

What version of PyTorch did you use? And should the first learning rate be 0.1 or 0.01?

Thanks & Regards, Michael

yueatsprograms commented 4 years ago

The first learning rate is 0.1. I am using PyTorch 1.1.0. I have just been ignoring this warning message.
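
For reference, the call order the warning asks for looks like the following. This is a minimal sketch, not the repo's actual training loop; the model, loss, data, and schedule milestones are placeholders:

```python
import torch
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

# Placeholders for illustration only, not this repo's actual setup.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = MultiStepLR(optimizer, milestones=[50, 75], gamma=0.1)
loader = [(torch.randn(8, 10), torch.randint(0, 2, (8,))) for _ in range(4)]

for epoch in range(100):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()   # update weights first (PyTorch >= 1.1 order)
    scheduler.step()       # then advance the LR schedule, once per epoch
```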

gyglim commented 4 years ago

Ok, great. So I guess this just means the learning rate schedule is shifted by one epoch? That should make a negligible difference. Thanks for the clarification.
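
To see the one-epoch shift concretely, here is a tiny sketch (the milestone and gamma values are placeholders) that prints the learning rate actually used in each epoch:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

params = [torch.zeros(1, requires_grad=True)]  # dummy parameter
optimizer = SGD(params, lr=0.1)
scheduler = MultiStepLR(optimizer, milestones=[2], gamma=0.1)

for epoch in range(4):
    # LR used for this epoch's updates: 0.1, 0.1, 0.01, 0.01
    print(epoch, optimizer.param_groups[0]["lr"])
    optimizer.step()    # the epoch's weight updates would go here
    scheduler.step()    # calling this before optimizer.step() instead shifts
                        # the decay one epoch earlier: 0.1, 0.01, 0.01, 0.01
```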