hongjoon0805 / SS-IL-Official


The problem about reducing the training epochs. #10

Closed HenryZ94264 closed 2 years ago

HenryZ94264 commented 2 years ago

In the official code, the file main.py contains the following in lines 59 to 64:

if 'ft' in args.trainer or 'ssil' in args.trainer:
    lr = args.lr / (t+1)
    if t==1:
        total_epochs = args.nepochs // args.factor
        schedule = schedule // args.factor

I'm wondering why the training epochs are reduced here. Could you please explain the reason for doing so? Thanks!

hongjoon0805 commented 2 years ago

This technique is introduced in the IL2M [1] paper. If you reduce the number of training epochs after learning Task 1, the model effectively reuses the features learned on the first task, and because the later tasks are trained for fewer epochs, overfitting to them is reduced. As a result, reducing the training epochs helps mitigate catastrophic forgetting in incremental learning. Thanks!

[1] Eden Belouadah and Adrian Popescu. IL2M: Class Incremental Learning with Dual Memory. In The IEEE International Conference on Computer Vision (ICCV), October 2019.
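For illustration, here is a minimal sketch of how this epoch decay plays out over tasks. All concrete numbers (nepochs, factor, base learning rate, and the milestone list) are made up for the example and are not taken from the repo; it just prints the effective per-task hyperparameters, under the reading that the reduction applied at `t==1` stays in effect for all later tasks.

```python
import numpy as np

nepochs = 100                      # hypothetical base number of epochs per task
factor = 5                         # hypothetical reduction factor (args.factor)
base_lr = 0.1                      # hypothetical initial learning rate (args.lr)
schedule = np.array([40, 60, 80])  # hypothetical LR-decay milestones (epochs)

for t in range(3):                 # incremental tasks 0, 1, 2, ...
    lr = base_lr / (t + 1)         # learning rate shrinks with each new task
    if t >= 1:                     # from the second task on, train for fewer epochs
        total_epochs = nepochs // factor
        milestones = schedule // factor
    else:                          # first task: full training budget
        total_epochs = nepochs
        milestones = schedule
    print(f"task {t}: lr={lr:.3f}, epochs={total_epochs}, "
          f"milestones={milestones.tolist()}")
```

With these toy values, Task 0 trains for 100 epochs with milestones [40, 60, 80], while every later task trains for only 20 epochs with milestones [8, 12, 16] and a lower learning rate.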

HenryZ94264 commented 2 years ago

Thanks for your prompt reply! I'm now running SS-IL on CIFAR-100, and it seems to achieve a better score without the epoch decay.