ternaus / iglovikov_helper_functions

An unstructured set of helper functions
MIT License

Function recall_precision in metrics.map wrong #103

Open rubencart opened 3 years ago

rubencart commented 3 years ago

I think the lines

        max_overlap = -np.inf  # initialised before the excerpt in the repo; shown for context
        jmax = -1

        if len(gt_boxes) > 0:
            overlaps = get_overlaps(gt_boxes, box)

            max_overlap = np.max(overlaps)
            jmax = np.argmax(overlaps)

        if max_overlap >= iou_threshold:
            if gt_checked[jmax] == 0:
                tp[prediction_index] = 1.0
                gt_checked[jmax] = 1
            else:
                fp[prediction_index] = 1.0
        else:
            fp[prediction_index] = 1.0

in metrics.map.recall_precision are wrong (link). Compare, e.g., with this function from another repo.

If the GT bbox with the highest overlap with the current predicted bbox has already been matched to another predicted bbox, the current prediction is marked as a false positive. However, if the GT bbox with the second-highest overlap is still unmatched and its overlap is above the IoU threshold, then the current predicted bbox should arguably still be marked as a true positive. Hence, instead of considering only the GT bbox with the highest overlap, I think all unmatched GT bboxes with overlap above the IoU threshold should be considered. Unless I'm missing something?
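To make the failure mode concrete, here is a small self-contained repro with toy boxes and a plain IoU helper of my own (a stand-in for the repo's `get_overlaps`); the matching loop is a condensed version of the "highest overlap only" logic quoted above:

```python
import numpy as np

def iou(a, b):
    # boxes as [x1, y1, x2, y2]
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

gt_boxes = [[0, 0, 10, 10], [0, 0, 8, 10]]
predictions = [[0, 0, 10, 10], [0, 0, 9, 10]]
iou_threshold = 0.5

# Greedy "highest overlap only" matching, as in the current code:
gt_checked = np.zeros(len(gt_boxes))
results = []
for box in predictions:
    overlaps = [iou(g, box) for g in gt_boxes]
    jmax = int(np.argmax(overlaps))
    if overlaps[jmax] >= iou_threshold and gt_checked[jmax] == 0:
        gt_checked[jmax] = 1
        results.append("TP")
    else:
        results.append("FP")

print(results)  # ['TP', 'FP'] -- yet the 2nd prediction overlaps the
                # unmatched 2nd GT box with IoU ~0.89, above the threshold
```

The second prediction's best overlap (0.9) is with the already-matched first GT box, so it is counted as a false positive even though its overlap with the still-unmatched second GT box (~0.89) clears the threshold.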

So I think it should be something like this:

        max_overlap = -np.inf  # reset for each predicted box
        jmax = -1

        if len(gt_boxes) > 0:
            overlaps = get_overlaps(gt_boxes, box)

            # only consider GT boxes that have not been matched yet
            for overlap_idx, overlap in enumerate(overlaps):
                if gt_checked[overlap_idx] == 0 and overlap > max_overlap:
                    max_overlap = overlap
                    jmax = overlap_idx

        if max_overlap >= iou_threshold:
            if gt_checked[jmax] == 0:
                tp[prediction_index] = 1.0
                gt_checked[jmax] = 1
            else:
                fp[prediction_index] = 1.0
        else:
            fp[prediction_index] = 1.0
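As a sanity check, here is a self-contained sketch of the proposed matching on the same kind of toy data (my own `iou` helper and `match` wrapper are hypothetical, standing in for the repo's `get_overlaps` and surrounding loop):

```python
import numpy as np

def iou(a, b):
    # boxes as [x1, y1, x2, y2]
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def match(predictions, gt_boxes, iou_threshold=0.5):
    gt_checked = np.zeros(len(gt_boxes))
    results = []
    for box in predictions:
        max_overlap, jmax = -np.inf, -1
        if len(gt_boxes) > 0:
            overlaps = [iou(g, box) for g in gt_boxes]
            # restrict the argmax to GT boxes that are still unmatched
            for overlap_idx, overlap in enumerate(overlaps):
                if gt_checked[overlap_idx] == 0 and overlap > max_overlap:
                    max_overlap = overlap
                    jmax = overlap_idx
        if max_overlap >= iou_threshold:
            gt_checked[jmax] = 1
            results.append("TP")
        else:
            results.append("FP")
    return results

# The 2nd prediction's best overlap is with an already-matched GT box,
# but its overlap with the unmatched 2nd GT box is still above threshold.
gt = [[0, 0, 10, 10], [0, 0, 8, 10]]
preds = [[0, 0, 10, 10], [0, 0, 9, 10]]
print(match(preds, gt))  # ['TP', 'TP']
```

Note that once `jmax` is restricted to unmatched GT boxes, the inner `gt_checked[jmax] == 0` check in the proposed snippet is always true and could be dropped; the sketch folds it away.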

I'll make a PR.

rubencart commented 3 years ago

@ternaus Any thoughts?