@Avcu, hi, I think you'll like this solution: you could specify a certain label for the metric itself:
import keras_metrics

# For binary crossentropy with 2-label output.
precision = keras_metrics.precision(label=1)
What do you think?
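For illustration, here is a minimal sketch of how that could look when compiling a model. The model layout and the `label` keyword follow the snippet above and are assumptions for the sake of the example, not the released API at this point in the thread:

import keras
import keras_metrics

model = keras.models.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    keras.layers.Dense(2, activation="softmax"),  # 2-label output
])

# Track precision and recall of the second label (label=1) during training.
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[keras_metrics.precision(label=1),
                       keras_metrics.recall(label=1)])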
Hi, thanks for the quick replies. I agree, using label numbers is a much more professional approach than making it explicit. However, to the best of my knowledge, recall and precision are special metrics, unlike accuracy: they are defined specifically for binary classification problems (for more information, see the Wikipedia page on precision and recall). In short, they are not used for multi-class problems unless you define new metrics, for example by splitting a 3-class problem into 3 binary classification problems (class 1 as '1' and classes 2 and 3 as '0', class 2 as '1' and classes 1 and 3 as '0', and so on), as in the sketch below. I think that is the only case in which recall and precision make sense. Secondly, I think it is wrong to say 'recall for the first label' or 'precision for the second label' when there are 2 output units: as that page shows, the terms are already strictly defined.
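To make the one-vs-rest construction concrete, here is a small standalone sketch (plain NumPy, independent of the library) that derives precision and recall for one class of a 3-class problem by treating that class as '1' and the other two classes as '0':

import numpy as np

# Ground truth and predictions for a 3-class problem (classes 0, 1, 2).
y_true = np.array([0, 1, 2, 1, 0, 2, 1, 1])
y_pred = np.array([0, 1, 1, 1, 2, 2, 0, 1])

def one_vs_rest_precision_recall(y_true, y_pred, positive_class):
    # Binarize: the chosen class becomes '1', every other class '0'.
    t = y_true == positive_class
    p = y_pred == positive_class
    tp = np.sum(t & p)    # true positives
    fp = np.sum(~t & p)   # false positives
    fn = np.sum(t & ~p)   # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

for c in range(3):
    print(c, one_vs_rest_precision_recall(y_true, y_pred, c))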
@Avcu, I've fixed the implementation to make the library more user-friendly.
@ybubnov looks perfect
This patch defines additional parameters on the metrics so that a particular class can be evaluated.
References #3.