tensorflow / models

Models and examples built with TensorFlow

[deeplab] How to output AP (Average precision) value during eval ? #8570

Open cclo-astri opened 4 years ago

cclo-astri commented 4 years ago

Hi all,

I am currently benchmarking some of the latest deep learning network models, but I found that some models only output AP (Average Precision) / AR (Average Recall) over various IoU ranges.

Does DeepLabV3 also support outputting such figures during evaluation?

Or have I misunderstood the relationship between mIoU and AP?

Thanks.

cclo-astri commented 4 years ago

Hi all,

I just found that the following function can provide different TP / FP / TN counts at different IoU thresholds:

From those I can calculate the Precision & Recall values corresponding to each IoU threshold.
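
For the per-threshold step I am just using the standard definitions (note that recall uses false negatives, FN, rather than TN); a minimal helper, purely for illustration:

def precision_recall(tp, fp, fn):
    # Precision: fraction of predicted positives that are correct.
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    # Recall: fraction of ground-truth positives that were found.
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    return precision, recall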

But after getting a list of Precision & Recall value pairs, I do not know how to convert these pairs into AP values. I only understand the concept that I should calculate the smoothed area under the Precision vs. Recall curve.
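
As far as I understand, the usual recipe for that conversion is the VOC-style "all-point" interpolation: make precision monotonically non-increasing in recall, then integrate. A minimal sketch (the helper name average_precision is just for illustration, not a DeepLab API):

import numpy as np

def average_precision(recalls, precisions):
    # Sort the (recall, precision) pairs by recall and pad with the
    # conventional endpoints (recall 0 and 1, precision 0).
    order = np.argsort(recalls)
    r = np.concatenate(([0.0], np.asarray(recalls, dtype=float)[order], [1.0]))
    p = np.concatenate(([0.0], np.asarray(precisions, dtype=float)[order], [0.0]))
    # Smooth: precision at recall r becomes the max precision at any recall >= r.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum the rectangular areas where recall increases.
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))

For example, average_precision([0.1, 0.4, 0.7], [0.9, 0.8, 0.5]) gives 0.48. COCO-style AP is then (roughly) this quantity computed at each IoU threshold from 0.50 to 0.95 and averaged.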

Can anyone shed some light on this?

Thanks.

cclo-astri commented 4 years ago

Hi all,

Any updates?

Thanks.


pn12 commented 4 years ago

Add the below to your code to evaluate precision / recall values at different IoU thresholds:

from object_detection.protos import eval_pb2

eval_config = eval_pb2.EvalConfig()
eval_config.metrics_set.extend(['coco_detection_metrics'])

This selects the COCO detection metrics, which report precision and recall at the different IoU thresholds.

The output will be in the below format:

Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.519
Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.501
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.387
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.594
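
If you configure evaluation through a pipeline .config file rather than building EvalConfig in Python, the equivalent should be a metrics_set entry in the eval_config block. This is only a sketch based on the evaluation_protocols.md doc linked below, not something I have verified against DeepLab:

eval_config: {
  # "coco_detection_metrics" switches on COCO-style AP/AR reporting
  metrics_set: "coco_detection_metrics"
}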

I referred to these docs for the same: https://github.com/tensorflow/models/blob/da23acba8ecb8c0e7c9a83cdb9f10092895c9dcc/research/object_detection/g3doc/evaluation_protocols.md and https://github.com/cocodataset/cocoapi/blob/master/PythonAPI/pycocoEvalDemo.ipynb
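
For reference, the pycocoEvalDemo notebook linked above boils down to something like the following (annotation_file and results_file are placeholders for your own COCO-format ground truth and predictions):

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: COCO-format ground truth and detection results.
annotation_file = 'instances_val.json'
results_file = 'detections.json'

coco_gt = COCO(annotation_file)
coco_dt = coco_gt.loadRes(results_file)

# iouType can be 'bbox', 'segm' or 'keypoints' depending on the task.
coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints the AP/AR table in the format shown above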