umbertogriffo / focal-loss-keras

Binary and Categorical Focal loss implementation in Keras.

Wrong computation of alpha in multiclass scenario #4

Closed ghost closed 4 years ago

ghost commented 5 years ago

Hi,

Firstly, sorry for the misclick that submitted this issue incomplete.

Secondly, I believe alpha is computed incorrectly in the multiclass scenario. It is just a constant applied to all cases, whereas in the binary scenario it genuinely addresses class imbalance, weighting the positive class by alpha and the negative class by 1 - alpha. Shouldn't we therefore choose a different alpha for every class, for example using class weights from scikit-learn?
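(To make the contrast concrete, here is a simplified numpy sketch of how α and 1 − α weight the two classes in the standard binary focal-loss formulation; this is illustrative, not the repo's Keras code:)

```python
import numpy as np

def binary_focal_loss(y_true, y_pred, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss: alpha weights positives, (1 - alpha) negatives."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    pos = -alpha * (1.0 - y_pred) ** gamma * y_true * np.log(y_pred)
    neg = -(1.0 - alpha) * y_pred ** gamma * (1.0 - y_true) * np.log(1.0 - y_pred)
    return pos + neg

# An easy, well-classified positive is down-weighted by (1 - p)^gamma ...
print(binary_focal_loss(1.0, 0.9))  # small loss
# ... while a confident mistake on a negative is penalized heavily.
print(binary_focal_loss(0.0, 0.9))  # much larger loss
```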

I believe that for the multiclass scenario there should be an additional parameter, called weights or similar, which could for example be the dictionary output by scikit-learn's class-weight utility.

alpha = weights[numpy.argmax(y_true)]

This line would select the proper alpha for the current class: a large weight for smaller classes and a small weight for larger classes.
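A sketch of this suggestion, using numpy only (the weight formula below is the same one scikit-learn's `compute_class_weight(class_weight="balanced", ...)` uses; the variable names are illustrative):

```python
import numpy as np

# Illustrative labels for an imbalanced 3-class problem.
labels = np.array([0] * 90 + [1] * 8 + [2] * 2)
n_classes = 3

# Balanced class weights: n_samples / (n_classes * count(class)),
# so rarer classes receive larger weights.
counts = np.bincount(labels, minlength=n_classes)
weights = len(labels) / (n_classes * counts)

# One-hot ground truth for a single sample of the rare class 2.
y_true = np.array([0.0, 0.0, 1.0])

# The proposal above: look up the alpha of the sample's true class.
alpha = weights[np.argmax(y_true)]
print(weights)
print(alpha)  # largest weight, since class 2 is the rarest
```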

umbertogriffo commented 4 years ago

Hi @sob3kx ,

I agree with you and I've just committed this change. You now need to specify α as an array whose size matches the number of categories, with each entry representing the weight of the corresponding category.

Thanks!
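The change described above can be sketched as follows. This is a simplified numpy version of a categorical focal loss with a per-class α array, for illustration only; the repo's actual implementation uses Keras backend ops:

```python
import numpy as np

def categorical_focal_loss(y_true, y_pred, alpha, gamma=2.0, eps=1e-7):
    """Per-sample focal loss with one alpha entry per class.

    y_true: one-hot targets, shape (batch, n_classes)
    y_pred: predicted probabilities, same shape
    alpha:  array of length n_classes
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    cross_entropy = -y_true * np.log(y_pred)
    # Focal modulation: easy examples (p close to 1) are down-weighted
    # by (1 - p)^gamma, while alpha re-weights each class.
    loss = np.asarray(alpha) * (1.0 - y_pred) ** gamma * cross_entropy
    return loss.sum(axis=-1)

y_true = np.array([[0.0, 1.0, 0.0]])
confident = np.array([[0.05, 0.9, 0.05]])
uncertain = np.array([[0.4, 0.3, 0.3]])
alpha = [0.25, 0.25, 0.25]

# The well-classified example contributes a much smaller loss.
print(categorical_focal_loss(y_true, confident, alpha))
print(categorical_focal_loss(y_true, uncertain, alpha))
```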

marvinquiet commented 3 years ago

Hi @umbertogriffo ,

Thank you for the contribution!

I just came across this issue and wanted to confirm: in a multi-class setting, should the vector of alphas add up to 1? In the example I see they are all 0.25, 0.25, and 0.25; I don't quite understand setting them all to the same weight when they don't add up to 1.
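A quick numeric check (using a simplified numpy focal loss for illustration, not the repo's exact Keras code) suggests the vector need not sum to 1: multiplying every α entry by the same constant only rescales the loss, which the learning rate can absorb, so only the ratios between classes matter.

```python
import numpy as np

def categorical_focal_loss(y_true, y_pred, alpha, gamma=2.0, eps=1e-7):
    """Simplified per-sample categorical focal loss (illustrative)."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return (np.asarray(alpha) * (1.0 - y_pred) ** gamma
            * -(y_true * np.log(y_pred))).sum(axis=-1)

y_true = np.array([[0.0, 1.0, 0.0]])
y_pred = np.array([[0.2, 0.7, 0.1]])

# The example's uniform alpha vs. the same vector normalized to sum to 1.
l_default = categorical_focal_loss(y_true, y_pred, [0.25, 0.25, 0.25])
l_summed = categorical_focal_loss(y_true, y_pred, [1 / 3, 1 / 3, 1 / 3])

# Normalizing multiplies every entry (and hence the loss) by the same
# constant (1/3) / 0.25 = 4/3; the relative class weights are unchanged.
print(l_default, l_summed)
```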