vandit15 / Class-balanced-loss-pytorch

Pytorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples"
MIT License

why is no_of_classes needed for weights normalisation #16

Closed m-zheng closed 2 years ago

m-zheng commented 3 years ago

Hi @vandit15,

Thanks for sharing your code.

In line 73, `weights = weights / np.sum(weights) * no_of_classes`, why is no_of_classes included here to normalise the weights? Any help would be appreciated.
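For reference, a minimal sketch of the computation around that line, assuming it follows the paper's effective-number weighting; the helper name `cb_weights` is hypothetical, while `beta`, `samples_per_cls`, and `no_of_classes` are taken from the repo's `CB_loss` arguments:

```python
import numpy as np

def cb_weights(samples_per_cls, no_of_classes, beta):
    # Effective number of samples per class: E_n = (1 - beta^n) / (1 - beta)
    effective_num = 1.0 - np.power(beta, samples_per_cls)
    weights = (1.0 - beta) / np.array(effective_num)
    # Line in question: rescale the weights so they sum to no_of_classes
    weights = weights / np.sum(weights) * no_of_classes
    return weights
```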

Thanks

Wangbenzhi commented 2 years ago

I have the same question. Have you resolved it?

fcakyon commented 2 years ago

It is not needed. @m-zheng @Wangbenzhi, refer to https://github.com/fcakyon/balanced-loss for an improved and maintained version.

zhandand commented 2 years ago

The paper mentions this in Section 4: they normalize \alpha so that it sums to C, where C is the total number of classes.
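As a quick numeric sketch (with made-up raw weights), the extra factor only rescales the weights so that they average to 1 across the C classes; the relative weighting between classes is unchanged, so the weighted loss keeps roughly the same overall scale as an unweighted cross-entropy:

```python
import numpy as np

# Hypothetical unnormalized class-balanced weights for C = 3 classes,
# e.g. (1 - beta) / effective_num before normalization.
raw_weights = np.array([0.2, 0.5, 2.0])
no_of_classes = len(raw_weights)

sum_to_one = raw_weights / np.sum(raw_weights)                 # sums to 1
sum_to_C = raw_weights / np.sum(raw_weights) * no_of_classes   # sums to C

print(sum_to_C / sum_to_one)  # constant factor C: relative weighting unchanged
print(np.mean(sum_to_C))      # 1.0: average weight is 1, so the loss scale
                              # matches an unweighted cross-entropy loss
```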