Closed: mechsihao closed this issue 3 years ago
Hi - thank you for your interest!
`ti_80M_selected.pickle`: you can find the file here. The link is also provided in the README.

\omega: We simply use 1 for all experiments. It's possible that a better selected \omega can lead to better results.

As for the epoch number, it is the same as the one without unlabeled data (you can refer to our code, e.g., `train_semi.py`).
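For reference, the objective discussed here is just the labeled-data loss plus an \omega-weighted loss on the pseudo-labeled data. Below is a minimal PyTorch sketch of that combination, assuming cross-entropy for both terms; the function and variable names (`semi_supervised_loss`, `pseudo_y`, etc.) are illustrative and not taken from `train_semi.py`:

```python
import torch.nn.functional as F

def semi_supervised_loss(model, labeled_x, labeled_y, unlabeled_x, pseudo_y, omega=1.0):
    """Illustrative self-training objective: supervised loss plus an
    omega-weighted loss on pseudo-labeled (unlabeled) data."""
    # Standard cross-entropy on the labeled batch
    loss_labeled = F.cross_entropy(model(labeled_x), labeled_y)
    # Cross-entropy on the unlabeled batch against its pseudo-labels
    loss_unlabeled = F.cross_entropy(model(unlabeled_x), pseudo_y)
    # omega = 1 is used for all experiments, per the reply above
    return loss_labeled + omega * loss_unlabeled
```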
Hi, thanks for the great work! I'd like to ask: the paper gives the total loss for the whole self-training stage, so how is this \omega chosen? Also, how is the total number of epochs for the semi-supervised training determined?