YyzHarry / imbalanced-semi-self

[NeurIPS 2020] Semi-Supervision (Unlabeled Data) & Self-Supervision Improve Class-Imbalanced / Long-Tailed Learning
https://arxiv.org/abs/2006.07529
MIT License

error python pretrain_rot.py --dataset cifar10 --imb_factor 0.01 #22

Closed madoka109 closed 3 years ago

madoka109 commented 3 years ago

When I run python pretrain_rot.py --dataset cifar10 --imb_factor 0.01, I get: RuntimeError: Given input size: (2048x1x1). Calculated output size: (2048x0x0). Output size is too small. How should I modify the code?

YyzHarry commented 3 years ago

Hi - did you modify the network architecture? For CIFAR experiments, one should use resnet32 as the default architecture.
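
For context, here is a minimal sketch of why this error appears when an ImageNet-style ResNet-50 is applied to CIFAR (illustrative PyTorch only, not the repository's actual model code): a 32x32 input is downsampled 32x to a 1x1 feature map, so a fixed-size average pool designed for 224x224 inputs has nothing left to pool.

```python
# Illustrative only; not the repository's actual model code.
import torch
import torch.nn as nn

# An ImageNet-style ResNet-50 downsamples its input 32x, so a 32x32
# CIFAR image reaches the final pooling stage as a 1x1 feature map
# with 2048 channels.
features = torch.randn(1, 2048, 1, 1)

# A fixed 7x7 average pool (sized for the 7x7 maps that 224x224
# ImageNet inputs produce) then computes an empty output and fails:
pool = nn.AvgPool2d(kernel_size=7)
try:
    pool(features)
except RuntimeError as e:
    print(e)  # Given input size: (2048x1x1). Calculated output size: (2048x0x0). ...
```

A CIFAR-style resnet32 avoids this because its downsampling is matched to 32x32 inputs.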

madoka109 commented 3 years ago

I understand, thank you very much. I have another question: I know that training the network with SSP uses 200 epochs, but is the pretrained model also trained for 200 epochs, or a different number? When I use 200 epochs to get the pretrained model, I directly get a result of 83.750, which seems better than the results in the paper. Am I misunderstanding something?

YyzHarry commented 3 years ago

I assume this accuracy is for the self-supervised pre-training task, that is, classifying rotations in this case. So it is different from the actual imbalanced classification performance.
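
To make the distinction concrete: the rotation pretext task is typically a 4-way classification over rotated copies of each image, so an accuracy such as 83.750 on it is not comparable to 10-way imbalanced CIFAR-10 accuracy. A minimal sketch of that standard setup (assumed recipe, not necessarily this repository's exact code):

```python
# Sketch of the usual rotation pretext setup; hypothetical helper,
# not necessarily the exact code in this repository.
import torch

def rotate_batch(images: torch.Tensor):
    """(N, C, H, W) -> four rotated copies (4N, C, H, W) with labels 0..3."""
    rotated = [torch.rot90(images, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4).repeat_interleave(images.size(0))
    return torch.cat(rotated, dim=0), labels

x = torch.randn(8, 3, 32, 32)  # dummy CIFAR-sized batch
xr, y = rotate_batch(x)
print(xr.shape, y.shape)  # torch.Size([32, 3, 32, 32]) torch.Size([32])
```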

madoka109 commented 3 years ago

I understand, thank you!