jiawei-ren / BalancedMetaSoftmax-Classification

[NeurIPS 2020] Balanced Meta-Softmax for Long-Tailed Visual Recognition
https://github.com/jiawei-ren/BalancedMetaSoftmax

Question about the balanced softmax loss #13

Closed cledor8712 closed 1 year ago

cledor8712 commented 2 years ago

Hi, thanks for your research and for sharing the code. It helped me a lot.

I have a question about the balanced softmax.

I understand that balanced softmax uses the class prior distribution during the training phase.

For example, cross entropy plus the log of the prior distribution (I'm not sure this is the right explanation).

Then, should we use the same balanced softmax loss during the test phase, too?

Or just use the normal cross entropy?

Intuitively, the test distribution is different from the train distribution, so I thought we don't need the balanced softmax loss during the test phase.

Thank you!

jiawei-ren commented 1 year ago

> Intuitively, the test distribution is different from the train distribution, so I thought we don't need the balanced softmax loss during the test phase.

Yes, we don't need balanced softmax in the test phase; at test time, just use the standard softmax on the raw logits. Sorry for the late reply.
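To make the train/test asymmetry concrete, here is a minimal NumPy sketch of the idea (the repo itself is in PyTorch, and the function names here are illustrative, not the repo's API): during training, the log of the class prior is added to the logits before the usual softmax cross-entropy, while at test time the raw logits are used directly.

```python
import numpy as np

def balanced_softmax_loss(logits, labels, class_counts):
    """Softmax cross-entropy with a log-prior adjustment on the logits.
    (Illustrative sketch; not the repo's exact implementation.)"""
    log_prior = np.log(class_counts / class_counts.sum())
    adjusted = logits + log_prior                       # broadcast over batch
    adjusted -= adjusted.max(axis=1, keepdims=True)     # numerical stability
    log_probs = adjusted - np.log(np.exp(adjusted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 3))
labels = np.array([0, 1, 2, 0])
counts = np.array([100.0, 10.0, 1.0])   # long-tailed training distribution

train_loss = balanced_softmax_loss(logits, labels, counts)  # training phase
test_preds = logits.argmax(axis=1)      # test phase: no prior adjustment
```

Because the log-prior term is an additive adjustment to the logits, dropping it at test time is all that is needed; the trained network weights already compensate for the class imbalance.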