arthurdouillard / incremental_learning.pytorch

A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).
MIT License
383 stars · 60 forks

Would you share the model file for ImageNet's top 500 classes? #23

Closed JoyHuYY1412 closed 3 years ago

JoyHuYY1412 commented 4 years ago

I am not so sure; it seems it needs around 10 days of training to get the first model?

arthurdouillard commented 4 years ago

I'm very sorry, but because of a careless mistake when I released my code open-source, I deleted a lot of my pre-trained weights, including those for ImageNet1000.

It should indeed take between 5 and 10 days, depending on your GPU.

JoyHuYY1412 commented 4 years ago

> I'm very sorry, but because of a careless mistake when I released my code open-source, I deleted a lot of my pre-trained weights, including those for ImageNet1000.
>
> It should indeed take between 5 and 10 days, depending on your GPU.

Thank you for your reply!

JoyHuYY1412 commented 4 years ago

I have one more small question; could you please give me some advice? To try your BiC model, I run:

python3 -minclearn --options options/bic/bic_cifar100.yaml options/data/cifar100_3orders.yaml --initial-increment 50 --increment 10 --fixed-memory --device 6 --label bic_cifar100_50_step_10_memory_20 --data-path ./data -save task --no-benchmark -memory 2000

But the average accuracy is a bit low (51.02). Do I need to make any adjustments?
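
For reference, by average accuracy I mean what I assume is the usual average incremental accuracy in this literature: the mean of the test accuracies measured after each incremental task. A minimal sketch of that computation (my assumption about the metric, with purely hypothetical numbers, not values from this repo's logs):

```python
from typing import Sequence


def average_incremental_accuracy(task_accuracies: Sequence[float]) -> float:
    """Mean of the per-task top-1 test accuracies (in %).

    For CIFAR-100 with an initial step of 50 classes followed by 5 steps
    of 10 classes, this would be a mean over 6 values.
    """
    if not task_accuracies:
        raise ValueError("Need at least one task accuracy.")
    return sum(task_accuracies) / len(task_accuracies)


# Hypothetical per-task accuracies, for illustration only.
print(average_incremental_accuracy([78.1, 64.3, 55.0, 50.2, 47.8, 44.6]))
```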

arthurdouillard commented 4 years ago

It's indeed much lower than my paper's result for BiC (~56).

I'll look into it this week, but your command seems fine.
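
For readers who land on this issue: BiC's distinguishing component is a small two-parameter bias-correction layer fitted on a held-out validation split after each task, which rescales only the logits of the newly added classes. Below is a minimal sketch of that layer as I read it from Wu et al. (2019), "Large Scale Incremental Learning"; it is an illustration under that reading, not necessarily how this repository implements it:

```python
import torch
from torch import nn


class BiasCorrectionLayer(nn.Module):
    """Sketch of BiC's bias correction: new-class logits become
    alpha * logit + beta, old-class logits are left untouched."""

    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))

    def forward(self, logits: torch.Tensor, new_class_mask: torch.Tensor) -> torch.Tensor:
        # new_class_mask is a boolean mask over the class dimension marking
        # the classes introduced in the current task.
        corrected = logits.clone()
        corrected[:, new_class_mask] = self.alpha * logits[:, new_class_mask] + self.beta
        return corrected


# Usage sketch: 100 logits, the last 10 being the newly added classes.
layer = BiasCorrectionLayer()
logits = torch.randn(4, 100)
mask = torch.zeros(100, dtype=torch.bool)
mask[-10:] = True
corrected = layer(logits, mask)
```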

JoyHuYY1412 commented 4 years ago

> It's indeed much lower than my paper's result for BiC (~56).
>
> I'll look into it this week, but your command seems fine.

Thanks again!