The `force_initialize_learning_rate` flag determines whether the learning rate will be loaded from the checkpoint during training.
If you want to start transfer learning or fine-tuning with a new learning rate, this flag should be set to True; but if you stop training to update some parameters and then want to continue training, you will want to load the previous LR, so the flag should be set to False.
There are several cases where you may or may not want to load the LR, which I think is why this flag exists.
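As a rough illustration of the behavior described above (a minimal sketch with hypothetical names; the actual training code may structure this differently):

```python
# Minimal sketch, not the project's actual implementation.
def resolve_learning_rate(flags, checkpoint_state):
    """Decide whether the LR comes from the CLI or the checkpoint."""
    if flags.force_initialize_learning_rate:
        # Transfer learning / fine-tuning: discard the checkpointed LR
        # and start fresh from the --learning_rate flag.
        return flags.learning_rate
    # Resuming an interrupted run: keep the LR stored in the checkpoint,
    # e.g. a value already reduced by LR decay before the interruption.
    return checkpoint_state.get("learning_rate", flags.learning_rate)
```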
It's nice to be able to decide where to load the LR from (the checkpoint or the `--learning_rate` flag).
I'll close this issue and convert it to a discussion if it's alright with everyone.
When transfer learning, it's usually the case that you set a new learning rate.
Unfortunately, you have to both set the LR and pass `force_initialize_learning_rate`, or else you get the old LR and have no idea :(

I don't see any reason we should keep this flag.
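For illustration, a transfer-learning invocation then has to combine both flags, something like this (the script name and LR value are hypothetical; only the two flags come from this thread):

```
python train.py --learning_rate 0.0001 --force_initialize_learning_rate
```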