yassouali / CCT

:page_facing_up: Semi-Supervised Semantic Segmentation with Cross-Consistency Training (CVPR 2020).
https://yassouali.github.io/cct_page/

Supervised Baseline / SSL-Method Possible Number of Epochs mismatch #75

Open LouisW2202 opened 6 days ago

LouisW2202 commented 6 days ago

If I use 10% of the data as labeled, the supervised loader contains 1/10 of the training data, while the unsupervised loader contains the remaining 9/10. In the code, the two loaders are zipped together, and the shorter supervised loader is cycled 9 times so that their lengths match. Consequently, by the end of one epoch, the supervised loader has been traversed 9 times.
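For clarity, here is a minimal sketch (not the repository's actual code) of the pattern I am referring to, using `itertools.cycle` to restart the shorter labeled loader; the dataset sizes and batch size below are placeholders:

```python
import itertools
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder datasets: 10% labeled, 90% unlabeled.
labeled = DataLoader(TensorDataset(torch.randn(10, 3)), batch_size=1)
unlabeled = DataLoader(TensorDataset(torch.randn(90, 3)), batch_size=1)

# itertools.cycle restarts the labeled loader each time it is exhausted,
# so one pass over the unlabeled loader revisits every labeled sample 9 times.
for (x_l,), (x_ul,) in zip(itertools.cycle(labeled), unlabeled):
    pass  # supervised + unsupervised losses would be computed here

print(len(unlabeled) // len(labeled))  # -> 9 labeled passes per "epoch"
```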

Therefore, when training the supervised baseline, the number of epochs should be adjusted accordingly: to see the labeled data the same number of times, the supervised baseline would need to train for 9 times the number of epochs used in the SSL setting. Since nothing in the code enforces this adjustment, and the paper only states, "For optimization, we train for 50 epochs," I would like to know whether you considered this issue in your work. Additionally, in the experiments comparing the SSL-trained model against the supervised baseline, did you ensure that the effective number of passes over the labeled data matched?
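As a back-of-the-envelope illustration (the numbers are assumptions based on a 10% split and the 50 epochs quoted above, not values taken from the code):

```python
ssl_epochs = 50          # "we train for 50 epochs" (from the paper)
labeled_fraction = 0.10  # 10% labeled split

# Passes over the labeled subset per SSL epoch (9 for a 10% split).
passes_per_epoch = round((1 - labeled_fraction) / labeled_fraction)

# Epochs a supervised baseline would need to see the labeled data as often.
equivalent_baseline_epochs = ssl_epochs * passes_per_epoch
print(equivalent_baseline_epochs)  # -> 450
```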

In any case, very interesting work; I look forward to your insights on this matter.