hshustc / CVPR19_Incremental_Learning

Learning a Unified Classifier Incrementally via Rebalancing

Adaptive weight of the LC loss #4

Open wonda opened 4 years ago

wonda commented 4 years ago

Hello, I found the calculation of the adaptive weight of the less-forget constraint different from the description in the paper. Did I misunderstand this part or miss some details? https://github.com/hshustc/CVPR19_Incremental_Learning/blob/e5a90aed7640f3b00e5b1a8dfb5376c1628bfe6a/cifar100-class-incremental/class_incremental_cosine_cifar100.py#L207

cxy1996 commented 4 years ago

Hello, I think there is no problem with the adaptive weight: `out_features1 + out_features2` is the number of old classes in the current session, and `args.nb_cl` is the number of novel classes in the current session, the same as described in the paper.
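For reference, a minimal sketch of what the linked line computes, assuming `lamda_base` is the base trade-off and using illustrative names and values (they may not match the repo exactly):

```python
import math

def adaptive_lamda(lamda_base, num_old_classes, num_new_classes):
    # Adaptive weight of the less-forget constraint as computed at the
    # linked line: it grows with the ratio of old classes
    # (out_features1 + out_features2) to novel classes (args.nb_cl).
    return lamda_base * math.sqrt(num_old_classes / num_new_classes)

# Illustrative example only: 50 old classes, 10 novel classes this session.
print(adaptive_lamda(5.0, 50, 10))  # ~11.18
```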

But I can't understand the use of class_mean in #5. Do you think there is a problem there?

wonda commented 4 years ago

@cxy1996 Thanks for your reply. In eq. (7) of the paper, the number of novel classes is in the numerator, but in the code it is in the denominator. Maybe there is a mistake in that equation.
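Spelling out the mismatch discussed above, with |C_o| denoting the old classes and |C_n| the novel classes of the current session (this notation is only illustrative):

```latex
% Eq. (7) as printed in the paper (novel classes in the numerator):
\lambda = \lambda_{\mathrm{base}} \sqrt{|C_n| \,/\, |C_o|}
% What the code at the linked line computes (novel classes in the denominator):
\lambda = \lambda_{\mathrm{base}} \sqrt{|C_o| \,/\, |C_n|}
```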

cxy1996 commented 4 years ago

@wonda You're right. I missed that.

JoyHuYY1412 commented 4 years ago

> Hello, I found the calculation of the adaptive weight of the less-forget constraint different from the description in the paper. Did I misunderstand this part or miss some details? https://github.com/hshustc/CVPR19_Incremental_Learning/blob/e5a90aed7640f3b00e5b1a8dfb5376c1628bfe6a/cifar100-class-incremental/class_incremental_cosine_cifar100.py#L207

I also cannot understand this. I also found that the result of this code for our-CNN is a little lower than our-NME; I don't know whether that is because I only did a single run.