yassouali / CCT

:page_facing_up: Semi-Supervised Semantic Segmentation with Cross-Consistency Training (CVPR 2020).
https://yassouali.github.io/cct_page/
MIT License

About checkpoint #61

Closed · NJNUCS closed 2 years ago

NJNUCS commented 2 years ago

I looked carefully at your code and found that the checkpoint does not save the learning rate. Will this have any effect if training is aborted and then resumed from the previous round?
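For context, a common PyTorch pattern for this (a minimal sketch, not CCT's actual checkpointing code; the model, optimizer, and key names below are placeholders) is to save the optimizer's `state_dict()` alongside the model, since it records each parameter group's current `lr`:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

def save_checkpoint(path, epoch):
    torch.save({
        "epoch": epoch,
        "model_state": model.state_dict(),
        "optimizer_state": optimizer.state_dict(),  # carries each group's current "lr"
    }, path)

def resume_checkpoint(path):
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["model_state"])
    optimizer.load_state_dict(ckpt["optimizer_state"])  # restores the learning rates
    return ckpt["epoch"] + 1  # epoch to resume from
```

If only the model weights are saved, resuming restores the parameters but the optimizer (and any LR schedule) restarts from its initial values, which is presumably the concern here.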

yassouali commented 2 years ago

Hi @NJNUCS

Sorry for the delayed response

Do you mean the LR used for fine-tuning or the initial LR? For the initial value, you can find the LR in the config file under "optimizer". Note that the backbone's LR is 1/10 of this value, since the backbone is pretrained (see this line here). So if you want to fine-tune, I would set all learning rates to 1/10 of this value.
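Concretely, that two-group setup looks roughly like this (a sketch with assumed module names, not the repo's exact code; check the trainer for the real parameter groups):

```python
import torch
import torch.nn as nn

backbone = nn.Conv2d(3, 64, 3)  # stands in for the pretrained encoder
decoder = nn.Conv2d(64, 21, 1)  # stands in for the decoders / heads

base_lr = 1e-2  # placeholder for the "lr" value under "optimizer" in the config
optimizer = torch.optim.SGD([
    {"params": backbone.parameters(), "lr": base_lr / 10},  # pretrained -> smaller LR
    {"params": decoder.parameters(), "lr": base_lr},
], momentum=0.9)

# The fine-tuning suggestion above amounts to dropping every group to base_lr / 10:
for group in optimizer.param_groups:
    group["lr"] = base_lr / 10
```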