openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
https://anomalib.readthedocs.io/en/latest/
Apache License 2.0

[Bug]: Image metrics seem to be inconsistent with Torchmetrics #2014

Closed: lathashree01 closed this 4 months ago

lathashree01 commented 5 months ago

Describe the bug

When I specify image metrics as image_metrics = ["F1Score", "Precision", "Recall", "Accuracy", "AUROC", "AUPR"]

and use it as

   engine = Engine(task=task, 
                    threshold="F1AdaptiveThreshold",
                    image_metrics=image_metrics,
                    callbacks=callbacks,
                    accelerator="gpu")

I am getting an error saying

F1Score class exists for backwards compatibility. It will be removed in v1.1. Please use BinaryF1Score from torchmetrics instead
Incorrect constructor arguments for Precision metric from TorchMetrics package.
Incorrect constructor arguments for Recall metric from TorchMetrics package.
Incorrect constructor arguments for Accuracy metric from TorchMetrics package.
...
  File "/home/user/anaconda3/envs/ad_env/lib/python3.10/site-packages/anomalib/metrics/__init__.py", line 60, in metric_collection_from_names
    metrics.add_metrics(metric_cls())
TypeError: PrecisionRecallCurve.__new__() missing 1 required positional argument: 'task'

Thanks for the help

Dataset

Folder

Model

PatchCore

Steps to reproduce the behavior

Use the image metrics above and run training; a minimal reproduction sketch follows.
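
For reference, a minimal reproduction sketch under assumed placeholders (the Folder dataset name, paths, and the classification task value are hypothetical, not taken from this report):

    from anomalib import TaskType
    from anomalib.data import Folder
    from anomalib.engine import Engine
    from anomalib.models import Patchcore

    image_metrics = ["F1Score", "Precision", "Recall", "Accuracy", "AUROC", "AUPR"]

    # Hypothetical Folder datamodule; name and paths are placeholders.
    datamodule = Folder(name="my_dataset",
                        root="./datasets/my_dataset",
                        normal_dir="good",
                        abnormal_dir="defect",
                        task=TaskType.CLASSIFICATION)

    model = Patchcore()
    engine = Engine(task=TaskType.CLASSIFICATION,
                    threshold="F1AdaptiveThreshold",
                    image_metrics=image_metrics,
                    accelerator="gpu")
    engine.fit(model=model, datamodule=datamodule)  # raises the TypeError above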

OS information


Expected behavior

I want the metrics to be calculated for the ones I have specified.

However, I was able to solve this problem by downgrading to torchmetrics==0.10.3. But I am not able to get it to work with the version specified in the pyproject.toml file, which is torchmetrics==1.3.2.

Screenshots

No response

Pip/GitHub

pip

What version/branch did you use?

1.1.0

Configuration YAML

    from anomalib.engine import Engine
    from anomalib.models import Patchcore

    callbacks = None
    backbone = "wideresnet50_2"
    layers = ("layer2", "layer3")    # for resnet models
    number_of_neighbours = 5
    coreset_ratio = 0.1

    # Create the model and engine (task and image_metrics are defined as above)
    model = Patchcore(backbone=backbone,
                      layers=layers,
                      pre_trained=True,
                      coreset_sampling_ratio=coreset_ratio,
                      num_neighbors=number_of_neighbours)

    engine = Engine(task=task,
                    threshold="F1AdaptiveThreshold",
                    image_metrics=image_metrics,
                    callbacks=callbacks,
                    accelerator="gpu")

Logs

Error as shown above

blaz-r commented 5 months ago

From the error message TypeError: PrecisionRecallCurve.__new__() missing 1 required positional argument: 'task', I believe the issue is in how these specific metrics are initialized.

In Torchmetrics, each of these metrics takes a "task" argument, but you can also use the class for that task directly. Since anomaly detection is a binary task, you'd initialize Precision as Precision(task="binary"). But as we only pass metrics by name here, you can use the equivalent BinaryPrecision instead.
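
To illustrate, the two equivalent constructions in torchmetrics 1.x:

    from torchmetrics import Precision
    from torchmetrics.classification import BinaryPrecision

    # The generic Precision wrapper requires a task argument and dispatches
    # to the task-specific class under the hood...
    metric_a = Precision(task="binary")
    # ...which can also be constructed directly, with no task argument:
    metric_b = BinaryPrecision()

    print(type(metric_a).__name__)  # BinaryPrecision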

lathashree01 commented 5 months ago

Hi @blaz-r

Thanks for your reply, but when I change to BinaryPrecision, I am getting:


F1Score class exists for backwards compatibility. It will be removed in v1.1. Please use BinaryF1Score from torchmetrics instead
No metric with name BinaryPrecision found in Anomalib metrics or TorchMetrics.

my torch deps:

torch                                    2.2.2
torchaudio                               2.2.2
torchmetrics                             1.3.2
torchvision                              0.17.2

Can you pls let me know how I can resolve this? Thanks

blaz-r commented 5 months ago

That's weird; maybe Anomalib doesn't use the latest version of Torchmetrics.

blaz-r commented 5 months ago

The current main branch does have the latest version of torchmetrics, so this should work (I haven't had the chance to try it). Maybe the problem is that Precision lives in the top-level torchmetrics module, while BinaryPrecision is inside torchmetrics.classification. We'll need to check the code.
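
A quick way to check where the class resolves on a given install:

    import torchmetrics
    import torchmetrics.classification

    # See which namespace exposes the binary metric on your torchmetrics version.
    print(torchmetrics.__version__)
    print(hasattr(torchmetrics, "BinaryPrecision"))
    print(hasattr(torchmetrics.classification, "BinaryPrecision"))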

lathashree01 commented 5 months ago

Thanks. I will check back for your update.

If I just use Precision, I get a similar error: Incorrect constructor arguments for Precision metric from TorchMetrics package.

[Solution Update]

I could not use Precision because the task argument was needed. My bad.

However, I was able to solve the problem by passing torchmetrics.classification.BinaryPrecision as the class path for image_metrics (metrics in dict format); see the sketch below.
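
For anyone hitting the same error, a sketch of that dict format, assuming the class_path/init_args convention used by anomalib 1.1 (the metric key names here are chosen for illustration):

    # Hypothetical metric names; each class_path points at a torchmetrics class.
    image_metrics = {
        "ImageF1Score": {
            "class_path": "torchmetrics.classification.BinaryF1Score",
            "init_args": {},
        },
        "ImagePrecision": {
            "class_path": "torchmetrics.classification.BinaryPrecision",
            "init_args": {},
        },
    }

    engine = Engine(task=task,
                    threshold="F1AdaptiveThreshold",
                    image_metrics=image_metrics,
                    accelerator="gpu")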

Thanks