When I trained the network, I got the following warning:
```
/usr/wiss/zuox/storage/slurm/zuox/miniconda3/envs/neucon/lib/python3.7/site-packages/torch/optim/lr_scheduler.py:123: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  "https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate", UserWarning)
```
Does it really matter? Should I follow the notice and call them in the opposite order?
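
For reference, here is the ordering I understand the warning to be asking for, as a minimal self-contained sketch. The model, data, and scheduler settings below are placeholders I made up for illustration, not my actual training code:

```python
import torch

# Hypothetical model, optimizer, and scheduler for illustration only.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
loss_fn = torch.nn.MSELoss()

inputs = torch.randn(32, 10)   # dummy batch
targets = torch.randn(32, 1)

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()   # update the parameters first...
    scheduler.step()   # ...then advance the learning-rate schedule
```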