Closed: NJNUCS closed this issue 2 years ago.
Hi @NJNUCS
Sorry for the delayed response
Do you mean the LR used for fine-tuning or the initial LR? For the initial value, you can find the LR in the config file under "optimizer". Note that the backbone's LR is 1/10 of this value, since the backbone is pretrained (see this line here). So if you want to fine-tune, I would set all learning rates to 1/10 of this value.
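A minimal PyTorch sketch of the setup described above, assuming the usual parameter-group mechanism (the model names and base LR value here are hypothetical, not from the actual config):

```python
import torch
import torch.nn as nn

# Hypothetical two-part model: a pretrained backbone plus a new head.
model = nn.ModuleDict({
    "backbone": nn.Linear(8, 8),
    "head": nn.Linear(8, 2),
})

base_lr = 0.01  # the "optimizer" LR from the config (assumed value)

# The pretrained backbone gets 1/10 of the base LR via a separate
# parameter group; the head trains at the full base LR.
optimizer = torch.optim.SGD([
    {"params": model["backbone"].parameters(), "lr": base_lr / 10},
    {"params": model["head"].parameters(), "lr": base_lr},
])

for group in optimizer.param_groups:
    print(group["lr"])
```

For fine-tuning as suggested above, you would simply pass `base_lr / 10` to both groups instead.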
I looked carefully at your code and found that the checkpoint does not save the learning rate. Will this have any effect if training is interrupted and then resumed from the previous epoch?
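For reference, the standard PyTorch way to make the learning rate survive a restart is to checkpoint the optimizer (and scheduler) state alongside the model weights. A minimal sketch, assuming plain `torch.save`/`torch.load` checkpointing (the file name and epoch value are illustrative):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

# Save the model AND the optimizer/scheduler state so the current
# learning rate (and schedule position) survive a restart.
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),   # includes per-group LRs
    "scheduler": scheduler.state_dict(),   # includes the step counter
    "epoch": 5,                            # illustrative value
}
torch.save(checkpoint, "ckpt.pth")

# On resume, restore all three so training continues with the same LR.
ckpt = torch.load("ckpt.pth")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
scheduler.load_state_dict(ckpt["scheduler"])
```

If only the model weights are saved, a resumed run re-creates the optimizer and scheduler from scratch, so the LR falls back to its initial value rather than where the schedule left off.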