Closed. 89douner closed this issue 4 years ago.
I don't know why the learning rate was set to 0.000006 after 1 epoch, but I resolved the problem by reinstalling Python, PyTorch, and the other packages. However, this warning message still appears: "SRFBB\lib\site-packages\torch\optim\lr_scheduler.py:143: UserWarning: The epoch parameter in 'scheduler.step()' was not necessary and is being deprecated where possible. Please use 'scheduler.step()' to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available."
This is just a warning message, so training runs without any problems.
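For reference, a minimal sketch of the call pattern the warning is about, assuming a standard per-epoch training loop in a recent PyTorch version (the model, optimizer, and scheduler names here are illustrative, not taken from the repository's training script):

```python
import torch

# Hypothetical model/optimizer/scheduler, only to illustrate the call pattern;
# these names are not taken from the SRFBN training code.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=200, gamma=0.5)

for epoch in range(5):
    out = model(torch.randn(4, 10))
    loss = out.mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Deprecated form that triggers the UserWarning:
    #   scheduler.step(epoch)
    # Preferred form: the scheduler keeps track of the epoch count itself.
    scheduler.step()
```

Calling scheduler.step() without the epoch argument silences the warning; the behavior of the scheduled learning rate is otherwise unchanged for a simple loop like this.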
OK, thank you.
After 1 epoch, the learning rate was set to 0.000006.
How do I fix it?