arthurdouillard / incremental_learning.pytorch

A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).
MIT License

The scaling parameter in the classifier #66

Closed windbro98 closed 1 year ago

windbro98 commented 1 year ago

Excuse me, sir. When reading your paper, I was intrigued to see a learnable parameter, $\eta$, in the classifier.

However, when I look into the code, the only variable that seems to correspond to $\eta$ is self.scaling. It is set to the integer 1 by default, which would mean it is not learnable.

I'd like to know whether this is intentional, or just a mistake?

arthurdouillard commented 1 year ago

PODNet uses FactorScalar (https://github.com/arthurdouillard/incremental_learning.pytorch/blob/0d25c2e12bde4a4a25f81d5e316751c90e6f789b/inclearn/lib/network/postprocessors.py#L26), which is a learned scalar. While initialized at one, it is still learned.

It helps the optimization with the cosine classifier.
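For readers hitting the same confusion: a minimal sketch of the idea, assuming the linked FactorScalar is essentially a scalar registered as an `nn.Parameter` (class and attribute names here are illustrative, not a verbatim copy of the repo's code). Because the scalar is a Parameter, it starts at 1 but receives gradients and is updated by the optimizer.

```python
import torch
from torch import nn

class FactorScalar(nn.Module):
    """Sketch of a learnable scaling factor eta applied to the logits.

    Assumed to mirror the repo's FactorScalar in
    inclearn/lib/network/postprocessors.py (not copied verbatim).
    """

    def __init__(self, initial_value: float = 1.0):
        super().__init__()
        # nn.Parameter => registered with the module, so the optimizer
        # updates it, even though it is initialized to 1.
        self.factor = nn.Parameter(torch.tensor(initial_value))

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        return self.factor * logits

scaler = FactorScalar()
logits = torch.randn(4, 10)
scaler(logits).sum().backward()

print(scaler.factor.requires_grad)       # True: eta is trainable
print(scaler.factor.grad is not None)    # True: gradients flow into it
```

The key point is the distinction between a plain Python attribute like `self.scaling = 1` (fixed) and an `nn.Parameter` (learned): initializing a Parameter at 1 does not make it constant.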

windbro98 commented 1 year ago

Thank you, sir. Your answer really helps me a lot.