liyunsheng13 / BDL


About the early stop in SSL training #34

Closed: JialeTao closed this issue 4 years ago

JialeTao commented 4 years ago

Hi, thanks for sharing the code. I noticed that in SSL training you use early stopping at 120000 iterations and take the model at that iteration for the next stage. In the paper, the first SSL stage improves the mIoU from 42.7 to 47.2, and the next stage improves it from 44.3 to 48.5. Did you observe the mIoU at intermediate iterations during training, for example at 60000 iterations?

liyunsheng13 commented 4 years ago

I don't recommend stopping training at around 60000 iterations. In my experiments, 120000 iterations usually achieved the best performance. I admit there is some fluctuation: sometimes 80000 or 100000 iterations can be as good as 120000. But it doesn't have much influence on the final results if you simply train the model for 120000 iterations.
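If you want to check this fluctuation yourself, one option is to evaluate the intermediate snapshots and compare their validation mIoU before committing to a checkpoint for the next stage. The sketch below is hypothetical and not part of the repo: the snapshot directory, the `GTA5_{iter}.pth` naming, and the `evaluate_miou()` helper (which would wrap the repo's own evaluation script) are all assumptions for illustration.

```python
# Hypothetical sketch: compare mIoU of intermediate SSL snapshots
# (60k / 80k / 100k / 120k) instead of only taking the 120000-iteration model.
import os

CHECKPOINT_DIR = "./snapshots/ssl_stage1"      # assumed snapshot directory
EVAL_ITERS = [60000, 80000, 100000, 120000]    # intermediate points to compare

def evaluate_miou(checkpoint_path):
    """Placeholder: run the repo's evaluation on the Cityscapes val set
    for the given checkpoint and return the mIoU as a float."""
    raise NotImplementedError

results = {}
for it in EVAL_ITERS:
    ckpt = os.path.join(CHECKPOINT_DIR, f"GTA5_{it}.pth")  # assumed naming
    if os.path.isfile(ckpt):
        results[it] = evaluate_miou(ckpt)

# Pick the iteration with the best validation mIoU for the next BDL stage.
if results:
    best_iter = max(results, key=results.get)
    print(f"best snapshot: {best_iter} iterations, mIoU = {results[best_iter]:.1f}")
```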