Closed dlwogns0128 closed 3 years ago
Hello, I have a question about training LipGAN. In the train.py code, the lr is set to 1e-3, but there is no scheduler code to reduce the learning rate. Is it correct to keep the learning rate constant at 1e-3?

We have not experimented extensively with the learning rate schedule. The publicly released model is trained for a couple more epochs with a lower learning rate, but we did not see a significant improvement from this change. We suspect a learning rate schedule might make a significant difference when training on a much larger dataset; the LRS2 train set is only about 29 hours.
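If a schedule is desired, it can be added without touching the optimizer setup. Below is a minimal, hypothetical sketch (not LipGAN's actual code) of a step-decay function; a function like this could be passed to Keras's `LearningRateScheduler` callback, which calls it once per epoch. The base lr, drop factor, and drop interval here are illustrative assumptions, not values from the repo.

```python
def step_decay(epoch, base_lr=1e-3, drop=0.5, epochs_per_drop=10):
    """Return the learning rate for the given epoch.

    Halves the base learning rate every `epochs_per_drop` epochs.
    All parameter values are illustrative, not from LipGAN's train.py.
    """
    return base_lr * (drop ** (epoch // epochs_per_drop))


# Example: lr over the first 25 epochs
for epoch in (0, 9, 10, 20, 25):
    print(epoch, step_decay(epoch))
```

With Keras, this would be hooked in as `model.fit(..., callbacks=[keras.callbacks.LearningRateScheduler(step_decay)])`; as noted above, on a dataset as small as LRS2 (~29 hours) the gain may be negligible.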