Debatrix / UPCL

Rethinking Class Incremental Learning from a Dynamic Imbalanced Learning Perspective


Introduction

This code is based on PyCIL, with modifications to details such as log output. The original logs of all experiments reported in the paper are in the logs/ directory. The code for our proposed UPCL method is in models/upcl.py.

If you use our method or code in your research, please consider citing the paper as follows:

@article{wang2024rethinking,
  title={Rethinking Class-Incremental Learning from a Dynamic Imbalanced Learning Perspective},
  author={Wang, Leyuan and Xiang, Liuyu and Wang, Yunlong and Wu, Huijia and He, Zhaofeng},
  journal={arXiv preprint arXiv:2405.15157},
  year={2024}
}

Dependencies

  1. torch 1.8.1
  2. torchvision 0.6.0
  3. tqdm
  4. numpy
  5. scipy

Run experiment

  1. Edit the [MODEL NAME].json file for global settings.
  2. Edit the hyper-parameters in the corresponding [MODEL NAME].py file (e.g., models/icarl.py).
  3. Run: python main.py --config=./exps/[MODEL NAME].json
Hyper-parameters

When using PyCIL, you can edit the global parameters and the algorithm-specific hyper-parameters in the corresponding JSON file. These include settings such as the dataset, memory size, number of initial classes, and the per-task class increment.
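For illustration, a PyCIL-style JSON config typically contains entries like the following. This is a hedged sketch based on common PyCIL config fields, not the exact contents of this repository's files; field names and values are assumptions:

```json
{
    "prefix": "reproduce",
    "dataset": "cifar100",
    "memory_size": 2000,
    "init_cls": 10,
    "increment": 10,
    "model_name": "upcl",
    "convnet_type": "resnet32",
    "device": ["0"],
    "seed": [1993]
}
```

Here init_cls sets the number of classes in the first task and increment the number of classes added at each subsequent task.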

Datasets

We have implemented the pre-processing of CIFAR-100, ImageNet-100, Tiny-ImageNet, and ImageNet-1000. When training on CIFAR-100, the framework downloads it automatically. When training on ImageNet-100/1000 or Tiny-ImageNet, you should specify the folder of your dataset in utils/data.py.

    def download_data(self):
        # Remove this assertion once you have set the dataset paths below
        assert 0, "You should specify the folder of your dataset"
        train_dir = '[DATA-PATH]/train/'
        test_dir = '[DATA-PATH]/val/'
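After pointing these variables at your local copy, the method can be filled in roughly as follows. This is a hypothetical sketch assuming an ImageNet-style layout (class subfolders under train/ and val/); the paths and the split_images_labels helper shown here are illustrative, not necessarily the repository's exact code:

```python
import numpy as np

def split_images_labels(imgs):
    """Split ImageFolder-style (path, label) pairs into two arrays."""
    paths = np.array([p for p, _ in imgs])
    labels = np.array([l for _, l in imgs])
    return paths, labels

class ImageNet100Data:
    def download_data(self):
        # Replace these placeholders with the actual location of your dataset
        train_dir = "/path/to/imagenet100/train/"
        test_dir = "/path/to/imagenet100/val/"

        # ImageFolder scans class subfolders and exposes (path, label) pairs
        from torchvision import datasets
        train_dset = datasets.ImageFolder(train_dir)
        test_dset = datasets.ImageFolder(test_dir)
        self.train_data, self.train_targets = split_images_labels(train_dset.imgs)
        self.test_data, self.test_targets = split_images_labels(test_dset.imgs)
```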