yxgeee / SpCL

[NeurIPS-2020] Self-paced Contrastive Learning with Hybrid Memory for Domain Adaptive Object Re-ID.
https://yxgeee.github.io/projects/spcl
MIT License

Loss cannot be converged #25

Closed · CWF-999 closed this issue 3 years ago

CWF-999 commented 3 years ago

During training, loss_s converges quickly, but loss_t stays between 8 and 9. What could be the reason for this?

yxgeee commented 3 years ago

Yes. As I remember, the value of loss_t is larger than that of loss_s because the target domain has more classes (clusters + outliers). It should not affect the final results, though. How's your training performance?
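
A minimal sketch (not the SpCL code) of why loss_t naturally sits higher: the hybrid-memory contrastive loss acts like a cross-entropy over all memory entries, and its floor grows roughly as log(number of entries). The target domain contributes many more entries (clusters plus un-clustered outliers) than the source domain has identities. The class counts below (751 and 13000) are illustrative assumptions only, not values taken from the repo.

```python
import torch
import torch.nn.functional as F

def expected_initial_loss(num_classes: int) -> float:
    # With near-uniform logits, cross-entropy is about log(num_classes),
    # so the loss floor rises with the number of classes in the memory.
    logits = torch.zeros(1, num_classes)   # uniform scores over all entries
    target = torch.tensor([0])
    return F.cross_entropy(logits, target).item()

print(expected_initial_loss(751))     # ~6.6, e.g. source-domain identities (assumed count)
print(expected_initial_loss(13000))   # ~9.5, e.g. target clusters + outliers (assumed count)
```

Under this rough model, a loss_t hovering around 8 to 9 with a much larger target class count is consistent with normal training rather than a sign of divergence.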

yxgeee commented 3 years ago

I will close this issue since there has been no further response.