PaddlePaddle / PaddleClas

A treasure chest for visual classification and recognition powered by PaddlePaddle
Apache License 2.0
5.5k stars 1.17k forks

Feature Request: Extend Confusion Matrix Support to Single-Label Classification #3219

Closed aboccag closed 3 months ago

aboccag commented 3 months ago

Description:

I would like to request an enhancement to the existing AccuracyScore metric in the PaddleClas framework to fully support single-label classification tasks. Currently, the AccuracyScore metric, which is part of the MultilabelMetric class, includes an implementation of the confusion matrix. This implementation works well for multi-label classification tasks but does not function correctly for single-label classification scenarios.

When attempting to use the AccuracyScore metric with a single-label classification model, the following error is encountered:

ValueError: Classification metrics can't handle a mix of multiclass and multilabel-indicator targets

This error occurs because the current implementation is tailored for multi-label scenarios, leading to issues when applied to single-label tasks.
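As a point of reference, the same scikit-learn error can be reproduced outside of PaddleClas (this sketch assumes scikit-learn is installed; it is not PaddleClas code). Passing multiclass ground-truth labels together with multilabel-indicator (one-hot) predictions to a classification metric triggers exactly this ValueError:

```python
import numpy as np
from sklearn.metrics import accuracy_score

y_true = np.array([0, 2, 1])                            # single-label (multiclass) targets
y_pred = np.array([[1, 0, 0], [0, 0, 1], [0, 1, 0]])    # one-hot / multilabel-indicator

try:
    accuracy_score(y_true, y_pred)
except ValueError as e:
    # -> Classification metrics can't handle a mix of multiclass
    #    and multilabel-indicator targets
    print(e)
```

This is why the fix needs to either convert the targets to a common representation or dispatch to the single-label confusion_matrix path.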

Use Case:

This feature is important for users who:

Current Limitation:

The AccuracyScore metric is designed for multi-label classification and works correctly in that context with the following configuration:

Metric:
  Train:
    - AccuracyScore:
  Eval:
    - AccuracyScore:

However, when this metric is applied to single-label classification models, it results in the error mentioned above. This limits its utility in typical classification tasks where only one label is assigned to each instance.

Expected Enhancement:

The proposed enhancement should:

Additional Context on Confusion Matrix

Here’s an example usage of the confusion matrix from scikit-learn:

from sklearn.metrics import confusion_matrix

y_true = ["cat", "ant", "cat", "cat", "ant", "bird"]
y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"]
cm = confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"])
print(cm)
# [[2 0 0]
#  [0 0 1]
#  [1 0 2]]

This function computes a confusion matrix to evaluate the accuracy of a classification. The matrix indicates how many samples were correctly or incorrectly classified across each category.
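A useful property for the proposed enhancement: overall accuracy can be recovered directly from this matrix, since the diagonal counts the correctly classified samples. A small sketch, assuming scikit-learn and NumPy are installed:

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

y_true = ["cat", "ant", "cat", "cat", "ant", "bird"]
y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"]
cm = confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"])

# Diagonal entries are correct predictions, so accuracy = trace / total.
acc = np.trace(cm) / cm.sum()
print(acc)  # 0.666... (4 of 6 samples correct)
assert np.isclose(acc, accuracy_score(y_true, y_pred))
```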

Parameters:

Returns:

Example Usage:

After the enhancement, users should be able to use the AccuracyScore metric with single-label classification models in the same way as with multi-label models:

Metric:
  Train:
    - AccuracyScore:
  Eval:
    - AccuracyScore:

This configuration should work correctly for both single-label and multi-label classification tasks.

liuhongen1234567 commented 3 months ago

Hello, multilabel_confusion_matrix in AccuracyScore is designed for multi-label classification tasks. If you want to calculate a confusion matrix for single-label classification, please use confusion_matrix instead of multilabel_confusion_matrix.

liuhongen1234567 commented 3 months ago

You can modify this code block in PaddleClas-develop/PaddleClas-develop/ppcls/metric/metrics.py according to the scikit-learn documentation.
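For readers who want to attempt the change, here is a minimal sketch of what a single-label variant of the metric could compute. The function name and call convention below are hypothetical, not the actual PaddleClas API; the sketch only illustrates the argmax-then-confusion_matrix approach suggested above:

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

def single_label_accuracy_score(output, target):
    """Hypothetical single-label counterpart to AccuracyScore.

    output: (N, C) class scores; target: (N,) integer labels.
    """
    preds = np.asarray(output).argmax(axis=1)   # collapse scores to one label per sample
    target = np.asarray(target).ravel()
    cm = confusion_matrix(target, preds)        # KxK matrix, not multilabel_confusion_matrix
    return {"AccuracyScore": accuracy_score(target, preds)}, cm

scores = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = np.array([0, 1, 1])
metrics, cm = single_label_accuracy_score(scores, labels)
print(metrics["AccuracyScore"])  # 0.666... (2 of 3 samples correct)
```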

aboccag commented 3 months ago

Many thanks!