Lightning-AI / torchmetrics

Machine learning metrics for distributed, scalable PyTorch applications.
https://lightning.ai/docs/torchmetrics/
Apache License 2.0
2.11k stars, 403 forks

MeanAveragePrecision - bug in `max_detection_thresholds` #2560

Closed nisyad-ms closed 4 months ago

nisyad-ms commented 4 months ago

🐛 Bug

MeanAveragePrecision returns map=-1 for any `max_detection_thresholds` value other than 100.

To Reproduce

from torch import tensor
from torchmetrics.detection import MeanAveragePrecision
preds = [dict(boxes=tensor([[0, 0, 100, 100],
                            [0, 0, 50, 50]]),
              scores=tensor([1.0, 0.9]),
              labels=tensor([0, 1]))]

target = [dict(boxes=tensor([[0, 0, 100, 100],
                             [0, 0, 50, 50]]),
               labels=tensor([0, 1]))]
metric = MeanAveragePrecision(iou_type="bbox", max_detection_thresholds=[1, 10, 50])
metric.update(preds, target)
result = metric.compute()
result  # map = -1

Expected behavior

Expected: map=1 (for 50 max detections)

Environment

SkafteNicki commented 4 months ago

Hi @nisyad-ms, thanks for reporting this issue. Sadly, this is due to a known bug in the official pycocotools backend, which we use for the computation. Specifically, this line: https://github.com/cocodataset/cocoapi/blob/8c9bcc3cf640524c4c20a9c40e89cb6a2f2fa0e9/PythonAPI/pycocotools/cocoeval.py#L460 should have been

stats[0] = _summarize(1, maxDets=self.params.maxDets[2])

for it to work. Sadly, the repo is no longer actively maintained, but it is still considered the official reference implementation for mAP.
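To see why the result is -1 rather than an error: the summarization step in pycocotools looks up the precision column whose maxDets setting equals a hard-coded 100, and falls back to a -1 sentinel when nothing matches. The sketch below is a simplified illustration of that failure mode (not the actual pycocotools code; `summarize` and the toy `precisions` array are made up for this example):

```python
import numpy as np

def summarize(precisions, max_dets_list, requested_max_dets=100):
    # Simplified version of the column selection in pycocotools' _summarize:
    # pick the columns whose maxDets setting equals the requested value.
    cols = [i for i, m in enumerate(max_dets_list) if m == requested_max_dets]
    s = precisions[:, cols]
    # If no column matched (or all entries are the -1 placeholder), fall
    # back to the -1 sentinel, which is what the user then sees as map=-1.
    if len(s[s > -1]) == 0:
        return -1.0
    return float(np.mean(s[s > -1]))

# One precision value per threshold in maxDets=[1, 10, 50]; since 100 is
# absent, the hard-coded maxDets=100 lookup selects no columns.
precisions = np.array([[1.0, 1.0, 1.0]])
print(summarize(precisions, [1, 10, 50]))      # -1.0: sentinel, the bug
print(summarize(precisions, [1, 10, 50], 50))  # 1.0: what the fix selects
```

The one-line fix linked above simply makes the lookup use the last entry of the user-provided maxDets list instead of the hard-coded 100.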

Instead, you can install the faster-coco-eval backend (https://github.com/MiXaiLL76/faster_coco_eval), which we also support. This backend has implemented the fix, so your code computes the correct value.

metric = MeanAveragePrecision(iou_type="bbox", max_detection_thresholds=[1, 10, 50], backend="faster_coco_eval")
metric.update(preds, target)
result = metric.compute()
print(result)

#{'map': tensor(1.), ...

Closing the issue, since we cannot fix this on our side.

nisyad-ms commented 4 months ago

Thanks @SkafteNicki for the information. How to ensure the faster-coco-eval backend is used? Just installing it will do? Thanks again.

SkafteNicki commented 4 months ago

> Thanks @SkafteNicki for the information. How to ensure the faster-coco-eval backend is used? Just installing it will do? Thanks again.

Sorry, I should have specified that. You install the backend with `pip install faster-coco-eval`, and then, when initializing the MeanAveragePrecision class, you set the `backend` argument to `"faster_coco_eval"`.