This code is based on PyCIL, with modifications to details such as log output. All original logs of the experiments reported in the paper are located in the `logs/` directory. The code for our proposed UPCL is in `models/upcl.py`.
If you use our method or code in your research, please consider citing the paper as follows:
```
@article{wang2024rethinking,
  title={Rethinking Class-Incremental Learning from a Dynamic Imbalanced Learning Perspective},
  author={Wang, Leyuan and Xiang, Liuyu and Wang, Yunlong and Wu, Huijia and He, Zhaofeng},
  journal={arXiv preprint arXiv:2405.15157},
  year={2024}
}
```
To run an experiment:

1. Edit the `[MODEL NAME].json` file for global settings.
2. Edit the hyper-parameters in the corresponding `[MODEL NAME].py` file (e.g., `models/icarl.py`).
3. Run:

```
python main.py --config=./exps/[MODEL NAME].json
```

When using PyCIL, you can edit the global parameters and the algorithm-specific hyper-parameters in the corresponding json file.
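For illustration only, a config sketch is shown below. The key names follow the usual PyCIL convention but are assumptions here; the authoritative settings are the json files shipped under `exps/`.

```json
{
    "dataset": "cifar100",
    "memory_size": 2000,
    "init_cls": 10,
    "increment": 10,
    "model_name": "upcl",
    "convnet_type": "resnet32",
    "seed": [1993]
}
```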
We have implemented the pre-processing of CIFAR100, imagenet100, Tinyimagenet, and imagenet1000. When training on CIFAR100, the framework will download it automatically. When training on imagenet100/1000 or Tinyimagenet, you should specify the folder of your dataset in `utils/data.py`:
```python
def download_data(self):
    # Remove this assert once you have set the dataset paths below.
    assert 0, "You should specify the folder of your dataset"
    train_dir = '[DATA-PATH]/train/'
    test_dir = '[DATA-PATH]/val/'
```
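As a sketch of what is expected here: point the two variables at your dataset root, which should contain `train/` and `val/` subfolders. The helper below is hypothetical (not part of the repo) and only checks that the layout exists:

```python
import os

def check_dataset_layout(data_root):
    """Hypothetical helper: verify that data_root contains the train/
    and val/ subfolders that utils/data.py expects, and return them."""
    train_dir = os.path.join(data_root, "train")
    test_dir = os.path.join(data_root, "val")
    for d in (train_dir, test_dir):
        if not os.path.isdir(d):
            raise FileNotFoundError("expected directory: " + d)
    return train_dir, test_dir
```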