bc-bytes opened this issue 1 year ago
Just to add to this, even with the lr_scheduler set, the learning rate never seems to change during training.
Hello @bc-bytes, first of all, thank you for raising the issue. The library does not currently support the ReduceLROnPlateau and CosineAnnealingWarmRestarts lr schedulers; all other schedulers are supported. Receiving this error message is therefore expected, and if you use either of these two schedulers the learning rate will not change. I intend to add support for them soon, so that all lr scheduler classes become available. Until then, please use any of the other schedulers.
Supported Schedulers: all of the other torch.optim.lr_scheduler classes (e.g. StepLR, MultiStepLR, ExponentialLR, CosineAnnealingLR, LambdaLR, CyclicLR, OneCycleLR).
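For context on why ReduceLROnPlateau in particular is awkward to support: in PyTorch its step() must be given the metric it monitors, whereas the schedulers listed above are stepped with no arguments, so a trainer that calls step() uniformly cannot drive it. A minimal illustration in plain PyTorch (nothing library-specific, values illustrative):

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Most schedulers are stepped without arguments:
step_lr = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)
step_lr.step()

# ReduceLROnPlateau is the exception: step() requires the monitored metric.
plateau = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min")
plateau.step(0.42)  # e.g. the current validation loss
```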
Many thanks for the suggestions. However, I am currently producing a set of baseline results from other repositories, and I have trained all those models using ReduceLROnPlateau, so changing the scheduler type would mean that the results would not be comparable to my previous ones.
In trainer.py (line 101), the call to the lr scheduler's step() is missing an argument. Here's how I define the optimiser and lr_scheduler in train.py:
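Something along these lines (the exact model and hyperparameter values here are illustrative):

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in for the actual model

# Optimiser over the model's parameters (lr value illustrative)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

# Reduce the learning rate when the monitored loss stops improving
lr_scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimiser, mode="min", factor=0.1, patience=10
)
```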
That throws the following error:
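Most likely the standard TypeError raised when ReduceLROnPlateau.step() is called without the metric it monitors:

```
TypeError: step() missing 1 required positional argument: 'metrics'
```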
If I then change line 101 to

```python
self.lr_scheduler.step(loss)
```
that seems to fix the error. However, when I start training I get this:

I haven't seen that before when training models with code from other repos. If that is normal, then all is OK; I just wanted to report the missing-argument error in trainer.py.
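For anyone else hitting this, a common way to support both kinds of scheduler in a training loop is to dispatch on the scheduler type. This is just a sketch, not necessarily how this repo's trainer.py is laid out:

```python
from torch.optim.lr_scheduler import ReduceLROnPlateau

def step_scheduler(lr_scheduler, loss):
    """Step any scheduler, passing the monitored metric only when required."""
    if isinstance(lr_scheduler, ReduceLROnPlateau):
        lr_scheduler.step(loss)  # plateau scheduling needs the metric
    else:
        lr_scheduler.step()      # the other schedulers take no arguments
```

Hard-coding self.lr_scheduler.step(loss) as in the workaround above would break the argument-less schedulers, which may be why the library restricts these two classes.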