G-U-N / ECCV22-FOSTER

The official implementation for ECCV22 paper: "FOSTER: Feature Boosting and Compression for Class-Incremental Learning" in PyTorch.
MIT License

log.txt #16

Closed anhongchap closed 1 year ago

anhongchap commented 1 year ago

Hello, I would like to use your method as a baseline in a comparison experiment. Could you please kindly provide the log.txt file for ImageNet B50 with 10 steps? I would be very grateful for your help.

G-U-N commented 1 year ago

Did you use the 100 classes listed in imagenet-sub? I have discussed this with others and found that it might influence the results. In this setting, the average accuracy at each step is [85.32, 83.4, 81.09, 79.2, 76.8, 75.52].
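For reference, the single number usually reported alongside such a list in class-incremental learning is the average incremental accuracy, i.e. the mean of the per-step accuracies. A minimal sketch using the numbers quoted above:

```python
# Per-step top-1 accuracies quoted above (ImageNet-100, B50 setting).
step_acc = [85.32, 83.4, 81.09, 79.2, 76.8, 75.52]

# Average incremental accuracy: the mean over all incremental steps.
avg_inc_acc = sum(step_acc) / len(step_acc)
print(f"Average incremental accuracy: {avg_inc_acc:.2f}")  # → 80.22
```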

anhongchap commented 1 year ago

I used the 100 classes you provided and ran the comparison experiment against your method, but I do not know your accuracy at each specific step, which is why I am asking. In particular, I would like to know the average accuracy at each step for ImageNet-100 B50-5 (10 steps) and CIFAR-100 B50-10 (5 steps). Could you please share them? Thank you very much!

G-U-N commented 1 year ago

Hi, thank you for using our work for comparison. I am sorry to say that the backup data was stored on my old laptop, which is currently not accessible. Instead, you can directly run the code in this repo to reproduce the results, which should not cost too much time or GPU resources.

anhongchap commented 1 year ago

Because there is a certain gap between my reproduced results and the results reported in your paper, my runs do not reflect the paper's accuracy well, which is why I am asking. If you get access to your old computer, I would appreciate it if you could look for the logs. Thank you.