ZixuanKe / PyContinual

PyContinual (An Easy and Extendible Framework for Continual Learning)

Baseline Hyperparameters CTR #23

Closed: J4nn4 closed this issue 1 year ago

J4nn4 commented 1 year ago

Hello,

I am trying to reproduce the results of the paper "Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning". Thanks to your answers, I was able to match the paper's results for the CTR architecture. Now, however, I am trying to reproduce the results for the Adapter-BERT baseline, and I could not find its hyperparameters. Could you please share the hyperparameters used for the Adapter-BERT baseline, in particular bert_adapter_size?

Thank you very much in advance!

ZixuanKe commented 1 year ago

Hello,

Thank you for your interest in our project.

We use a large adapter size (2000) to ensure a fair comparison; see https://github.com/ZixuanKe/PyContinual/blob/main/src/config.py#L59
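For reference, the linked definition amounts to an argparse argument with a default of 2000. Here is a minimal runnable sketch; the flag name and default value come from the link above, while the surrounding boilerplate and help text are assumptions:

```python
import argparse

# Sketch of how bert_adapter_size is defined in src/config.py
# (paraphrased; only the flag name and default are taken from the link).
parser = argparse.ArgumentParser()
parser.add_argument('--bert_adapter_size', default=2000, type=int,
                    help='bottleneck size of each adapter layer')

args = parser.parse_args([])      # no CLI override, so the default applies
print(args.bert_adapter_size)    # -> 2000
```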

In general, any hyperparameter that is not set in load_base_args.py falls back to its default value in config.py.
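In other words, load_base_args.py acts as a per-baseline override layer on top of the config.py defaults. A minimal sketch of that precedence, assuming an argparse-based setup (the function name, the learning_rate flag, and all values other than bert_adapter_size are illustrative, not the repository's exact API):

```python
import argparse

def build_args():
    # Defaults, playing the role of src/config.py.
    parser = argparse.ArgumentParser()
    parser.add_argument('--bert_adapter_size', default=2000, type=int)
    parser.add_argument('--learning_rate', default=3e-5, type=float)
    args = parser.parse_args([])

    # Baseline-specific overrides, playing the role of load_base_args.py.
    # Any hyperparameter absent from this dict keeps its config.py default.
    overrides = {'learning_rate': 5e-5}   # illustrative value only
    for name, value in overrides.items():
        setattr(args, name, value)
    return args

args = build_args()
print(args.bert_adapter_size)  # -> 2000: not overridden, default applies
print(args.learning_rate)      # -> 5e-5: overridden by the baseline config
```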