Vanint / SADE-AgnosticLT

This repository is the official PyTorch implementation of Self-Supervised Aggregation of Diverse Experts for Test-Agnostic Long-Tailed Recognition (NeurIPS 2022).
MIT License

About CIFAR10-LT's Implementation details #11

Closed sunhappy6900 closed 2 years ago

sunhappy6900 commented 2 years ago

Hello. In your paper, the top-1 accuracy on CIFAR10-LT (imbalance ratio 10 and 100) is 90.8% and 83.8%, but when I run your source code I get 90.16% and 82.92%. What are the specific implementation details for CIFAR10-LT? Thank you~

Vanint commented 2 years ago

Hi, if I remember correctly: on CIFAR10-LT, you need to change "reduce_dimension" to False and adjust the learning rate to 0.05. You can try this configuration first. Thanks.
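As a rough illustration, the two changes above might look like this in the repo's JSON config. Note the surrounding key names (`arch`, `optimizer`, `args`) are assumptions based on typical PyTorch-template configs; verify them against the actual CIFAR config files in the repository:

```json
{
  "arch": {
    "args": {
      "reduce_dimension": false
    }
  },
  "optimizer": {
    "args": {
      "lr": 0.05
    }
  }
}
```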

sunhappy6900 commented 2 years ago

Is the learning rate fixed at 0.05 for all 200 epochs?

Vanint commented 2 years ago

No, I use the same learning rate decay strategy as CIFAR100-LT.
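A decay strategy of that kind is typically linear warmup followed by multi-step decay. The sketch below is only an illustration of that pattern; the warmup length, milestones (160/180), and decay factor here are assumptions, not values confirmed in this thread, so check the repo's CIFAR100-LT config for the actual schedule:

```python
def lr_at_epoch(epoch, base_lr=0.05, warmup_epochs=5,
                milestones=(160, 180), gamma=0.1):
    """Return the learning rate at a given epoch.

    Linear warmup for the first `warmup_epochs`, then the rate is
    multiplied by `gamma` at each milestone epoch. All schedule
    parameters here are illustrative assumptions.
    """
    if epoch < warmup_epochs:
        # Linear warmup from base_lr / warmup_epochs up to base_lr.
        return base_lr * (epoch + 1) / warmup_epochs
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr *= gamma  # step decay at each passed milestone
    return lr
```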

[image: screenshot of the learning-rate decay schedule used for CIFAR100-LT]

sunhappy6900 commented 2 years ago

Thank you, let me give it a try.

sunhappy6900 commented 2 years ago

Thank you, this worked.

Huangszz commented 2 years ago

> Thank you, this worked.

Could you please provide the config file for CIFAR10? @sunhappy6900