wvangansbeke / Unsupervised-Classification

SCAN: Learning to Classify Images without Labels, incl. SimCLR. [ECCV 2020]
https://arxiv.org/abs/2005.12320

consistency loss and entropy loss #82

Closed by ipheiman 2 years ago

ipheiman commented 3 years ago

Hi authors, thank you for the brilliant work! I tried training step 2 (SCAN clustering) on my custom dataset, but the consistency loss started at around 3.0 and only decreased to 1.9 after 300 epochs, while the entropy loss stayed relatively flat. For your training on the ImageNet subsets, what consistency and entropy loss values did you obtain?

Thanks for reading this! Cheers :)

wvangansbeke commented 3 years ago

Hi @ipheiman

Thanks. I appreciate the kind words. I don't know the exact numbers for ImageNet by heart, but 1.9 is plausible. It's not always easy to balance both loss terms. You can try reducing the entropy weight, in order to make the consistency term more important during the optimization step. This might hurt the uniformity constraint though. Maybe try running on ImageNet first if your dataset is similar. The insights in the training dynamics can be helpful for your own dataset. The configuration scripts are provided in this repo.