coqui-ai / STT

🐸STT - The deep learning toolkit for Speech-to-Text. Training and deploying STT models has never been so easy.
https://coqui.ai
Mozilla Public License 2.0

Feature request: Remove force_initialize_learning_rate #2297

Closed · JRMeyer closed this 2 years ago

JRMeyer commented 2 years ago

When transfer learning, it's usually the case that you set a new learning rate.

Unfortunately, you have to both set the new LR and pass force_initialize_learning_rate, or else you silently get the old LR from the checkpoint and have no idea :(

I don't see any reason we should keep this flag.
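For concreteness, the gotcha looks roughly like this (a sketch only, assuming the Python training entry points work the way I remember; the checkpoint dirs and data file are made up):

```python
# Rough sketch of the transfer-learning gotcha, assuming the
# coqui_stt_training Python entry points; paths and files are hypothetical.
from coqui_stt_training.train import train
from coqui_stt_training.util.config import initialize_globals_from_args

initialize_globals_from_args(
    load_checkpoint_dir="ckpts/pretrained",   # hypothetical pretrained checkpoint
    save_checkpoint_dir="ckpts/finetuned",
    train_files=["finetune.csv"],             # hypothetical fine-tuning data
    learning_rate=1e-4,                # silently overridden by the checkpoint's LR
    # force_initialize_learning_rate=True,  # <- also required for the new LR to apply
)
train()
```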

ghost commented 2 years ago

The force_initialize_learning_rate flag determines whether the learning rate is loaded from the checkpoint during training. If you want to start transfer learning or fine-tuning with a new learning rate, this flag should be set to True; but if you stop training to update some parameters and then want to continue the same run, you will want to load the previous LR, so the flag should be set to False.

There are several cases where you do or don't want to load the LR from the checkpoint, which I think is why this flag exists.
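In other words, the flag plausibly gates a decision like this during checkpoint loading (a hypothetical sketch of the semantics, not the actual STT code):

```python
# Hypothetical sketch of the decision force_initialize_learning_rate controls;
# illustrative only, not the actual checkpoint-loading code.
from typing import Optional

def effective_learning_rate(
    config_lr: float,
    checkpoint_lr: Optional[float],
    force_initialize_learning_rate: bool,
) -> float:
    """Pick the LR a resumed or fine-tuned run actually trains with."""
    if force_initialize_learning_rate or checkpoint_lr is None:
        return config_lr      # fresh LR from the config/flags (transfer learning)
    return checkpoint_lr      # LR saved in the checkpoint (resuming a run)
```

So True corresponds to the transfer-learning case and False (the default) to the resume case.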

wasertech commented 2 years ago

It's nice to be able to decide where to load the LR from (the checkpoint or the --learning_rate flag). I'll close this issue and convert it to a discussion if that's alright with everyone.