YyzHarry / imbalanced-semi-self

[NeurIPS 2020] Semi-Supervision (Unlabeled Data) & Self-Supervision Improve Class-Imbalanced / Long-Tailed Learning
https://arxiv.org/abs/2006.07529
MIT License

Can it be used to solve the class-imbalance problem in supervised learning? If so, how? #7

Closed Rayzlx closed 3 years ago

Rayzlx commented 3 years ago

Thanks for sharing the codes! Hope you can answer my question!

YyzHarry commented 3 years ago

Hi, thanks for your interest. The techniques are indeed designed in the context of supervised learning --- we have labels, but they are imbalanced. To apply, e.g., self-supervised pre-training: (1) pre-train the network with a self-supervised task for representation learning (class labels are not needed in this step); (2) run standard supervised training on your imbalanced labeled data, initializing from the pre-trained weights.

For more details please refer to our paper.
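A minimal sketch of this two-stage recipe, assuming a 4-way rotation-prediction pretext task as the self-supervised step (the network sizes and batch here are toy values for illustration, not the paper's configuration):

```python
import torch
import torch.nn as nn

# Toy backbone for illustration only.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())

# Stage (1): self-supervised pre-training, e.g. predicting which of 4
# rotations was applied; the targets are rotation indices, so no class
# labels are required.
rot_head = nn.Linear(64, 4)
opt = torch.optim.SGD(
    list(backbone.parameters()) + list(rot_head.parameters()), lr=0.1)
x = torch.randn(8, 3, 32, 32)          # unlabeled batch
rot_y = torch.randint(0, 4, (8,))      # rotation pseudo-labels
loss = nn.functional.cross_entropy(rot_head(backbone(x)), rot_y)
opt.zero_grad(); loss.backward(); opt.step()

# Stage (2): standard supervised training on the (imbalanced) labels,
# initializing from the pre-trained backbone; all parameters are updated.
cls_head = nn.Linear(64, 10)
opt = torch.optim.SGD(
    list(backbone.parameters()) + list(cls_head.parameters()), lr=0.1)
y = torch.randint(0, 10, (8,))         # class labels, possibly imbalanced
loss = nn.functional.cross_entropy(cls_head(backbone(x)), y)
opt.zero_grad(); loss.backward(); opt.step()
```

In practice stage (2) is just the usual training loop for your task, optionally combined with any re-weighting or re-sampling scheme you already use.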

YuemingJin commented 3 years ago

Hi,

Thanks for your great work! I am wondering: in the self-supervised setting, during step (2), the supervised training, do you freeze the backbone parameters learned in step (1) for representation learning, or do you fine-tune all the parameters of the whole network? Thanks!!

YyzHarry commented 3 years ago

Hi @YuemingJin - thanks for your interest. We did not fix the backbone parameters during the supervised training stage. Unlike the linear evaluation protocol in self-supervised learning, here we want to maximize performance on the imbalanced learning task, so we fine-tune all parameters.
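The distinction can be shown in a few lines (the module names and sizes below are made up for illustration):

```python
import torch.nn as nn

# Toy pretrained backbone plus a task classifier.
backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
classifier = nn.Linear(16, 3)

# Linear evaluation protocol (common in self-supervised benchmarks,
# but NOT what is described above) would freeze the backbone:
#   for p in backbone.parameters():
#       p.requires_grad = False

# The setting described above: keep every parameter trainable and
# fine-tune the whole network on the imbalanced task.
trainable = [p for m in (backbone, classifier) for p in m.parameters()]
assert all(p.requires_grad for p in trainable)
```

Since PyTorch parameters default to `requires_grad=True`, fine-tuning the whole network simply means passing all parameters to the optimizer and never freezing anything.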