yaoyao-liu / class-incremental-learning

PyTorch implementation of AANets (CVPR 2021) and Mnemonics Training (CVPR 2020 Oral)
https://class-il.mpi-inf.mpg.de
MIT License
459 stars · 71 forks

How are the hyperparameters tuned? #29

Open ashok-arjun opened 2 years ago

ashok-arjun commented 2 years ago

Hi @yaoyao-liu,

Thanks for your wonderful work.

I have a question: how are the hyperparameters for your model set in general (not specifically for AANets; I mean parameters such as lambda that determine the stability-plasticity trade-off)?

Do you do multiple runs on the entire dataset, with many hyperparameter combinations sampled from a coarse grid?

Or do you determine the hyperparameters separately for each task?
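For reference, the first strategy asked about could be sketched as a coarse grid search. This is only an illustrative mock-up: the grid values, the `lam`/`lr` names, and the `evaluate` stub are assumptions standing in for a full incremental-learning run, not the repository's actual tuning code.

```python
import itertools

# Hypothetical coarse grid for a stability-plasticity trade-off weight
# (e.g. a lambda on a distillation loss) and a learning rate.
# These values are illustrative, not the paper's settings.
LAMBDA_GRID = [0.5, 1.0, 2.0, 5.0, 10.0]
LR_GRID = [0.01, 0.1]

def evaluate(lam: float, lr: float) -> float:
    """Stand-in for a full training run over all incremental tasks.

    A real search would train the model with (lam, lr) on every task
    and return the average incremental accuracy; here we return a
    mock score so the sketch is runnable.
    """
    return 1.0 / (1.0 + abs(lam - 5.0)) + 0.1 * lr

def coarse_grid_search():
    """Try every combination on the grid and keep the best one."""
    best, best_score = None, float("-inf")
    for lam, lr in itertools.product(LAMBDA_GRID, LR_GRID):
        score = evaluate(lam, lr)
        if score > best_score:
            best, best_score = (lam, lr), score
    return best, best_score

if __name__ == "__main__":
    (lam, lr), score = coarse_grid_search()
    print(lam, lr)
```

The second strategy would instead re-run a search like this after each new task arrives, which is far more expensive and is generally avoided in class-incremental benchmarks.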

Thank you!

yaoyao-liu commented 2 years ago

Hi @ashok-arjun,

Thanks for your interest in our work. For the hyperparameters inherited from LUCIR, we directly follow their settings for a fair comparison. You may check their hyperparameters here.

Best, Yaoyao