ShipengWang / Adam-NSCL

PyTorch implementation of the Adam-NSCL algorithm from our CVPR 2021 (oral) paper "Training Networks in Null Space of Feature Covariance for Continual Learning"
MIT License

A small query regarding baselines #3

Closed: JosephKJ closed this issue 3 years ago

JosephKJ commented 3 years ago

Hi @ShipengWang,

Thank you for your amazing work. This is indeed great that you have shared the code too.

I have two queries, and it would be very kind of you if you could answer them:

Thanks, Joseph

ShipengWang commented 3 years ago

We reimplemented all the baselines. GEM, A-GEM, and MEGA were re-run using their released code with the same hyperparameters (including the number of replayed images).

GD-WILD: the total number of images used for replay is 200, for a fair comparison of memory usage.

JosephKJ commented 3 years ago

Hi @ShipengWang. Thank you for your swift response.

For GD-WILD, is it 200 images per class? Also, does this vary across the CIFAR100-10, CIFAR-100-20, and MiniImageNet-25 experiments?

ShipengWang commented 3 years ago

It is 200 images of previous tasks in total, not per class. Note, however, that GD-WILD additionally uses a large number of wild (external) images.
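
For concreteness, here is a minimal sketch (not the authors' code) of what a fixed *total* replay budget looks like: the 200-image memory is shared across all previous tasks and rebalanced as new tasks arrive, rather than allocating 200 images per class. The `ReplayMemory` class and its methods are hypothetical names for illustration only.

```python
import random


class ReplayMemory:
    """Hypothetical fixed-budget replay memory: `budget` images in
    total, shared across all previous tasks (not per class)."""

    def __init__(self, budget=200):
        self.budget = budget
        self.per_task = {}  # task_id -> list of (image, label) pairs

    def add_task(self, task_id, samples):
        """Store samples from a finished task, then rebalance so the
        total number of stored images never exceeds the budget."""
        self.per_task[task_id] = list(samples)
        quota = self.budget // len(self.per_task)  # equal share per task
        for tid in self.per_task:
            if len(self.per_task[tid]) > quota:
                self.per_task[tid] = random.sample(self.per_task[tid], quota)

    def sample(self, batch_size):
        """Draw a replay batch uniformly from the pooled memory."""
        pool = [s for stored in self.per_task.values() for s in stored]
        return random.sample(pool, min(batch_size, len(pool)))
```

Under this scheme, after 10 tasks each task retains roughly 20 images, which matches the "200 in total" reading rather than the "200 per class" one.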