kaidic / LDAM-DRW

[NeurIPS 2019] Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss
https://arxiv.org/pdf/1906.07413.pdf
MIT License

Cannot achieve similar results for Tiny ImageNet #14

Open MapleLeafKiller opened 3 years ago

MapleLeafKiller commented 3 years ago

Thanks for your paper and your code; they are great work and have helped me a lot. I ran experiments on the Tiny ImageNet dataset following the settings described in your paper, but I cannot achieve similar results. For long-tailed (1:100) Tiny ImageNet, the top-1 validation errors I got are:

- ERM SGD: 80.05
- LDAM SGD: 72.8

There is a big gap compared with the results reported in your paper, so I wonder if there is any setting or trick I have missed.

In the paper you mention: "We perform 1 crop test with the validation images." I wonder how this is done specifically.

For ResNet-18, I use:

```python
backbone = models.resnet18(pretrained=True)
backbone.avgpool = nn.AdaptiveAvgPool2d(1)
num_ftrs = backbone.fc.in_features
if USE_NORM:
    backbone.fc = NormedLinear(num_ftrs, 200)
else:
    backbone.fc = nn.Linear(num_ftrs, 200)
```

Is this correct? Looking forward to your reply, thank you very much!
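For context on the ERM-vs-LDAM gap being discussed: the LDAM paper's label-distribution-aware margin assigns each class j a margin proportional to n_j^{-1/4}, so rarer (tail) classes get larger margins. A minimal NumPy sketch of that margin schedule, assuming a cap `max_m = 0.5` rescaled so the largest margin equals the cap (the function name and constant here are illustrative, not taken verbatim from the repo):

```python
import numpy as np

def ldam_margins(cls_num_list, max_m=0.5):
    """Per-class margins m_j proportional to n_j^(-1/4), rescaled so the
    largest margin (on the rarest class) equals max_m.
    Sketch of the LDAM margin schedule; max_m=0.5 is an assumed default."""
    n = np.asarray(cls_num_list, dtype=np.float64)
    m = 1.0 / np.sqrt(np.sqrt(n))        # n_j^(-1/4)
    return m * (max_m / m.max())         # rescale: rarest class gets max_m

# Toy long-tailed class counts (head -> tail): 500, 50, 5 images.
margins = ldam_margins([500, 50, 5])
# Tail classes receive strictly larger margins than head classes.
```

With a 1:100 imbalance the margin ratio between the rarest and most frequent class is 100^(1/4) ≈ 3.16, which is what pushes the decision boundary away from tail classes during training.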
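Regarding the `NormedLinear` head in the snippet above: such a head computes cosine similarity between L2-normalized features and L2-normalized weight columns, which keeps logits bounded so the additive LDAM margins are comparable across classes. A NumPy sketch of that computation (this is an illustration of the idea, not the repo's PyTorch implementation):

```python
import numpy as np

def normed_linear(x, weight, eps=1e-12):
    """Cosine-similarity head: L2-normalize feature rows of x and weight
    columns, then take their dot products. Entries lie in [-1, 1].
    NumPy sketch of a NormedLinear-style layer, names are illustrative."""
    x_n = x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)
    w_n = weight / (np.linalg.norm(weight, axis=0, keepdims=True) + eps)
    return x_n @ w_n  # shape: (batch, num_classes)

rng = np.random.default_rng(0)
# 4 samples with 512-dim features, 200 Tiny ImageNet classes.
logits = normed_linear(rng.normal(size=(4, 512)), rng.normal(size=(512, 200)))
```

In the actual model these cosine logits are typically multiplied by a scale factor inside the loss before the softmax, since raw values in [-1, 1] produce overly flat probability distributions.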