zhihou7 / BatchFormer

CVPR 2022, BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning, https://arxiv.org/abs/2203.01522

About reproducing the results on CIFAR100-LT #7

Closed xinqiaozhao closed 2 years ago

xinqiaozhao commented 2 years ago

Thank you so much for sharing the code. I ran into an issue when trying to reproduce the RIDE results on CIFAR100-LT reported in your paper.

I used the following command: python train.py -c "./configs/config_imbalance_cifar100_ride.json" --reduce_dimension 1 --num_experts 3 --add_bt 1

I tracked the val_accuracy printed at each iteration, but the result is very poor, so I suspect I am using it the wrong way. Could you give me a hint on how to reproduce the CIFAR100-LT result? Thank you so much; it would help a lot.

zhihou7 commented 2 years ago

Hi, thanks for your interest. Have you tried running RIDE on ImageNet-LT? It seems I have not included the code for CIFAR100-LT. I'll check the code again.

Regards, Zhi Hou

xinqiaozhao commented 2 years ago

Thank you for the quick reply. I'm running the ImageNet-LT code now, and I'm really looking forward to your CIFAR100-LT code; that would be very helpful. Thank you~ :)

zhihou7 commented 2 years ago

Thanks. I think I have found the issue. Are you getting a result of only around 0.25? I added a shared classifier for CIFAR100-LT because I found BatchFormer does not work on CIFAR100-LT without it, even though BatchFormer without a shared classifier works impressively on ImageNet-LT and iNaturalist.
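For context, the shared-classifier strategy means the original features and the BatchFormer-transformed features are both passed through one classifier during training, so its weights are shared across the two streams, and the module is removed entirely at test time. Below is a minimal PyTorch sketch following the pseudocode in the paper; the encoder configuration shown is just an illustrative example, not necessarily the repo's exact setup.

```python
import torch
import torch.nn as nn

def batch_former(x, y, encoder, is_training):
    # x: [N, C] batch of features; y: [N] labels.
    # encoder: e.g. nn.TransformerEncoderLayer(d_model=C, nhead=4,
    #          dim_feedforward=C, dropout=0.5) -- example config only.
    if not is_training:
        # BatchFormer is removed at inference; the backbone and the
        # shared classifier are used as-is.
        return x, y
    pre_x = x
    # Treat the batch as a length-N sequence so attention runs across
    # samples rather than within a sample: [N, C] -> [N, 1, C] -> [N, C].
    x = encoder(x.unsqueeze(1)).squeeze(1)
    # Shared classifier: concatenate original and transformed features
    # and duplicate the labels, so both streams pass through the same
    # downstream classifier and share its weights.
    x = torch.cat([pre_x, x], dim=0)
    y = torch.cat([y, y], dim=0)
    return x, y
```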

Regards,

zhihou7 commented 2 years ago

I had not included BatchFormer for RIDE on CIFAR100-LT. I have now updated the code for CIFAR100-LT. You can run it like this:

python train.py -c "./configs/config_imbalance_cifar100_ride.json" --reduce_dimension 1 --num_experts 3 --add_bt 33

Empirically, BatchFormer does not improve RIDE on CIFAR100-LT.

xinqiaozhao commented 2 years ago

Thank you so much!

xinqiaozhao commented 2 years ago

> Are you getting a result of only around 0.25?

Yes, I got a result of around 0.25 yesterday.