ultralytics / yolov3

YOLOv3 in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

COCOAPI mAP results and test.py misalign / mismatch #915

Closed: AlphaGoMK closed this issue 4 years ago

AlphaGoMK commented 4 years ago

🐛 Bug

Thanks for your awesome project!👏👏 The mAP results produced by cocoapi and test.py are mismatched by a large margin.

To Reproduce

Steps to reproduce the behavior:

  1. Download the official pretrained model from pjreddie's website
  2. Run python test.py --weights yolov3-spp.weights --data ../../data/coco2017.data --cfg cfg/yolov3-spp.cfg --save-json

Expected behavior

The mAPs produced by the two methods should be equal, or at least close to each other.
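
For reference, the detections that test.py writes with --save-json can also be re-scored with pycocotools on their own; a minimal sketch, assuming the ground-truth annotations live at annotations/instances_val2017.json and the detections were saved as results.json (both paths are assumptions, adjust to your setup):

import json
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO('annotations/instances_val2017.json')   # COCO ground-truth annotations (path is an assumption)
coco_dt = coco_gt.loadRes('results.json')              # detections written by test.py --save-json (name is an assumption)

coco_eval = COCOeval(coco_gt, coco_dt, 'bbox')
# score only the images that actually appear in the results file
coco_eval.params.imgIds = sorted({d['image_id'] for d in json.load(open('results.json'))})
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()                                  # prints the standard COCO AP/AR table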

Environment


(pytorch)  ⚡ root@lab  master ●  python test.py --weights yolov3-spp.weights --data ../../data/coco2017.data --cfg cfg/yolov3-spp.cfg --save-json
Namespace(batch_size=32, cfg='cfg/yolov3-spp.cfg', conf_thres=0.001, data='../../data/coco2017.data', device='', img_size=416, iou_thres=0.6, save_json=True, single_cls=False, task='test', weights='yolov3-spp.weights')
Using CUDA device0 _CudaDeviceProperties(name='GeForce RTX 2080 Ti', total_memory=10989MB)
           device1 _CudaDeviceProperties(name='GeForce RTX 2080 Ti', total_memory=10989MB)

Caching labels (4952 found, 0 missing, 48 empty, 0 duplicate, for 5000 images): 100%|████████████████████████████████████████████████████████████████████████████████| 5000/5000 [00:00<00:00, 12290.42it/s]
               Class    Images   Targets         P         R   mAP@0.5        F1: 100%|███████████████████████████████████████████████████████████████████████████████████| 157/157 [02:28<00:00,  1.55it/s]
                 all     5e+03  3.68e+04    0.0088   0.00419  0.000991   0.00551
              person     5e+03   1.1e+04   0.00658   0.00418   0.00046   0.00511
             bicycle     5e+03       316         0         0  0.000672         0
                 car     5e+03  1.93e+03   0.00657   0.00362  0.000587   0.00467
          motorcycle     5e+03       371   0.00541    0.0027   0.00127    0.0036
            airplane     5e+03       143    0.0095   0.00699  0.000666   0.00806
                 bus     5e+03       285   0.00485   0.00351  0.000342   0.00407
               train     5e+03       190         0         0     4e-06         0
               truck     5e+03       415         0         0  0.000265         0
                boat     5e+03       430         0         0   0.00084         0
       traffic light     5e+03       637   0.00354   0.00157  0.000196   0.00217
        fire hydrant     5e+03       101         0         0  3.85e-05         0
           stop sign     5e+03        75         0         0         0         0
       parking meter     5e+03        60         0         0         0         0
               bench     5e+03       413         0         0  0.000145         0
                bird     5e+03       440   0.00494   0.00227  0.000613   0.00311
                 cat     5e+03       202   0.00552   0.00495   7.1e-05   0.00522
                 dog     5e+03       218         0         0  4.59e-05         0
               horse     5e+03       273         0         0  0.000179         0
               sheep     5e+03       361    0.0222    0.0139   0.00218    0.0171
                 cow     5e+03       380   0.00795   0.00526   0.00069   0.00633
            elephant     5e+03       255   0.00918   0.00784  0.000519   0.00846
                bear     5e+03        71         0         0         0         0
               zebra     5e+03       268    0.0149    0.0112   0.00103    0.0128
             giraffe     5e+03       232   0.00542   0.00431  0.000592    0.0048
            backpack     5e+03       371         0         0  0.000188         0
            umbrella     5e+03       413    0.0107   0.00484  0.000588   0.00666
             handbag     5e+03       540    0.0122   0.00185  0.000441   0.00322
                 tie     5e+03       254         0         0   0.00021         0
            suitcase     5e+03       303   0.00769    0.0033  0.000997   0.00462
             frisbee     5e+03       115         0         0         0         0
                skis     5e+03       241         0         0  0.000555         0
           snowboard     5e+03        69         0         0  2.08e-05         0
         sports ball     5e+03       263    0.0162    0.0114  0.000326    0.0134
                kite     5e+03       336   0.00519   0.00298  0.000393   0.00378
        baseball bat     5e+03       146         0         0   0.00128         0
      baseball glove     5e+03       148         0         0  0.000122         0
          skateboard     5e+03       179         0         0  0.000141         0
           surfboard     5e+03       269   0.00778   0.00372  0.000302   0.00503
       tennis racket     5e+03       225         0         0  8.64e-05         0
              bottle     5e+03  1.02e+03   0.00408   0.00195  0.000274   0.00264
          wine glass     5e+03       343   0.00714   0.00292   0.00107   0.00414
                 cup     5e+03       899   0.00214   0.00111  0.000478   0.00146
                fork     5e+03       215         0         0  0.000165         0
               knife     5e+03       326    0.0431    0.0092   0.00219    0.0152
               spoon     5e+03       253         0         0  0.000445         0
                bowl     5e+03       626   0.00688   0.00319  0.000409   0.00436
              banana     5e+03       379    0.0327   0.00792   0.00368    0.0127
               apple     5e+03       239    0.0245   0.00837   0.00279    0.0125
            sandwich     5e+03       177         0         0   0.00155         0
              orange     5e+03       287    0.0637    0.0348   0.00785     0.045
            broccoli     5e+03       316    0.0251   0.00633   0.00347    0.0101
              carrot     5e+03       371    0.0453    0.0108   0.00243    0.0174
             hot dog     5e+03       127     0.135     0.063    0.0145    0.0859
               pizza     5e+03       285   0.00587   0.00351   0.00099   0.00439
               donut     5e+03       338    0.0652    0.0414   0.00791    0.0507
                cake     5e+03       316    0.0137   0.00633    0.0029   0.00866
               chair     5e+03  1.79e+03   0.00711   0.00223  0.000712    0.0034
               couch     5e+03       261    0.0142   0.00766  0.000186   0.00994
        potted plant     5e+03       343         0         0   0.00015         0
                 bed     5e+03       163    0.0101   0.00613  0.000186   0.00763
        dining table     5e+03       697         0         0  0.000119         0
              toilet     5e+03       179         0         0   0.00022         0
                  tv     5e+03       288         0         0  3.59e-05         0
              laptop     5e+03       231    0.0123   0.00866  0.000872    0.0102
               mouse     5e+03       106         0         0  6.83e-05         0
              remote     5e+03       283         0         0  0.000159         0
            keyboard     5e+03       153         0         0  0.000219         0
          cell phone     5e+03       262         0         0  0.000124         0
           microwave     5e+03        55         0         0  2.46e-05         0
                oven     5e+03       143         0         0  0.000135         0
             toaster     5e+03         9         0         0         0         0
                sink     5e+03       225    0.0101   0.00444  0.000468   0.00617
        refrigerator     5e+03       126         0         0  0.000382         0
                book     5e+03  1.16e+03   0.00295  0.000861  0.000144   0.00133
               clock     5e+03       267         0         0  3.18e-06         0
                vase     5e+03       277   0.00683   0.00361  0.000488   0.00472
            scissors     5e+03        36         0         0  7.65e-05         0
          teddy bear     5e+03       191         0         0   0.00211         0
          hair drier     5e+03        11         0         0   0.00109         0
          toothbrush     5e+03        57         0         0   0.00122         0
Speed: 8.5/0.8/9.3 ms inference/NMS/total per 416x416 image at batch-size 32

COCO mAP with pycocotools...
loading annotations into memory...
Done (t=0.77s)
creating index...
index created!
Loading and preparing results...
DONE (t=5.20s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *bbox*
DONE (t=112.24s).
Accumulating evaluation results...
DONE (t=11.89s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.404
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.672
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.425
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.210
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.440
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.592
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.312
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.505
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.550
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.360
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.603
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.721

Additional context

I've tried commits d55dbc1 and 65eeb1b; both show the same issue.
I modified the paths in coco2017.data and the cocoGt path in test.py; everything else remains unchanged.
Could you please provide any advice on this problem, or point me to any supplementary material on model evaluation?
Thanks in advance :)

glenn-jocher commented 4 years ago

@AlphaGoMK your command should work properly, except that most of the original training was done with COCO 2014, so you want to use coco2014.data for testing, not coco2017.data. Note that it will download pjreddie's original *.weights files automatically. When we run this code here, everything seems in order.

Note that test.py natively returns mAPs that are slightly below pycocotools, but the difference is generally < 1%. Also note that the best-performing YOLO model is yolov3-spp-ultralytics.pt, trained from scratch with this repo. See https://github.com/ultralytics/yolov3#map
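
For reference, a darknet-style *.data file is just plain key=value lines, so switching from coco2017.data to coco2014.data only changes which image lists test.py evaluates against. A minimal illustrative parser (read_data_cfg here is a sketch, not the repo's own function, and the example paths in the comment are illustrative):

# A *.data file typically looks like:
#   classes=80
#   train=../coco/trainvalno5k.txt
#   valid=../coco/5k.txt
#   names=data/coco.names
def read_data_cfg(path):
    opts = {}
    for line in open(path):
        line = line.strip()
        if line and '=' in line and not line.startswith('#'):
            k, v = line.split('=', 1)
            opts[k.strip()] = v.strip()
    return opts

print(read_data_cfg('data/coco2014.data').get('valid'))  # path to the validation image list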

git clone https://github.com/ultralytics/yolov3
bash yolov3/data/get_coco2014.sh
cd yolov3
python3 test.py --weights yolov3-spp.weights --cfg yolov3-spp.cfg --data data/coco2014.data --save-json

Namespace(batch_size=32, cfg='yolov3-spp.cfg', conf_thres=0.001, data='data/coco2014.data', device='', img_size=416, iou_thres=0.6, save_json=True, single_cls=False, task='test', weights='yolov3-spp.weights')
Using CUDA device0 _CudaDeviceProperties(name='Tesla P100-PCIE-16GB', total_memory=16280MB)

Downloading https://drive.google.com/uc?export=download&id=16lYS4bcIdM2HdmyJBVDOvt3Trx6N3W2R as yolov3-spp.weights... Done (4.7s)
Caching labels (4954 found, 46 missing, 0 empty, 0 duplicate, for 5000 images): 100% 5000/5000 [00:00<00:00, 10020.79it/s]
               Class    Images   Targets         P         R   mAP@0.5        F1: 100% 157/157 [02:47<00:00,  1.75it/s]
                 all     5e+03  3.51e+04     0.824     0.401     0.566      0.52
              person     5e+03  1.05e+04     0.879     0.558     0.736     0.683
             bicycle     5e+03       313     0.891     0.262      0.47     0.405
                 car     5e+03  1.64e+03     0.835     0.425     0.596     0.564
          motorcycle     5e+03       388     0.936     0.455     0.671     0.612
            airplane     5e+03       131     0.967     0.681     0.851     0.799
                 bus     5e+03       259     0.946     0.676     0.838     0.788
               train     5e+03       212     0.927     0.718      0.83     0.809
               truck     5e+03       352     0.824     0.346     0.579     0.488
                boat     5e+03       458     0.818     0.256     0.477      0.39
       traffic light     5e+03       516     0.786     0.329     0.484     0.464
        fire hydrant     5e+03        83         1     0.751     0.853     0.857
           stop sign     5e+03        84     0.929     0.679     0.788     0.784
       parking meter     5e+03        59     0.825     0.399     0.511     0.538
               bench     5e+03       471     0.814     0.195     0.341     0.314
                bird     5e+03       453     0.774     0.362     0.481     0.493
                 cat     5e+03       195     0.837      0.71     0.783     0.768
                 dog     5e+03       223      0.92     0.668     0.837     0.774
               horse     5e+03       305     0.932     0.633     0.813     0.754
               sheep     5e+03       306     0.887     0.507     0.713     0.645
                 cow     5e+03       376     0.869     0.503     0.678     0.637
            elephant     5e+03       283      0.93     0.801     0.905     0.861
                bear     5e+03        53     0.933     0.698     0.893     0.799
               zebra     5e+03       275     0.947     0.716     0.872     0.816
             giraffe     5e+03       170     0.985     0.792     0.909     0.878
            backpack     5e+03       384     0.795     0.132     0.334     0.226
            umbrella     5e+03       387     0.843     0.359     0.587     0.504
             handbag     5e+03       483     0.666    0.0742     0.206     0.133
                 tie     5e+03       290     0.766     0.327     0.482     0.458
            suitcase     5e+03       309     0.814     0.353     0.576     0.492
             frisbee     5e+03       109     0.945     0.626     0.798     0.753
                skis     5e+03       281      0.76     0.158     0.389     0.262
           snowboard     5e+03        90     0.814     0.367     0.528     0.506
         sports ball     5e+03       233     0.854     0.528     0.622     0.652
                kite     5e+03       381     0.722     0.457     0.563      0.56
        baseball bat     5e+03       123     0.934     0.374     0.581     0.534
      baseball glove     5e+03       139       0.8     0.374     0.466      0.51
          skateboard     5e+03       215     0.909     0.559     0.706     0.692
           surfboard     5e+03       266     0.869     0.375     0.613     0.523
       tennis racket     5e+03       183     0.937      0.57     0.724     0.709
              bottle     5e+03       939     0.737     0.284     0.454      0.41
          wine glass     5e+03       363     0.858      0.32      0.51     0.466
                 cup     5e+03       891     0.822     0.327     0.498     0.468
                fork     5e+03       234      0.87     0.218     0.411     0.349
               knife     5e+03       290     0.656     0.138     0.275     0.228
               spoon     5e+03       253      0.75    0.0949     0.235     0.168
                bowl     5e+03       617     0.747     0.321     0.493     0.449
              banana     5e+03       359     0.814     0.184     0.379       0.3
               apple     5e+03       158     0.474     0.184     0.228     0.265
            sandwich     5e+03       158      0.74     0.325     0.496     0.452
              orange     5e+03       185     0.583     0.205     0.272     0.304
            broccoli     5e+03       330     0.681     0.115     0.358     0.197
              carrot     5e+03       341     0.636     0.108     0.275     0.184
             hot dog     5e+03       160     0.813     0.272     0.511     0.408
               pizza     5e+03       223     0.853     0.489     0.649     0.621
               donut     5e+03       225     0.829     0.462      0.62     0.593
                cake     5e+03       236     0.857     0.356     0.528     0.503
               chair     5e+03  1.59e+03     0.806     0.227     0.428     0.354
               couch     5e+03       236     0.729     0.419     0.603     0.532
        potted plant     5e+03       429     0.831     0.196     0.464     0.317
                 bed     5e+03       195     0.869     0.564     0.745     0.684
        dining table     5e+03       633     0.777     0.298     0.489      0.43
              toilet     5e+03       179     0.922     0.658     0.841     0.768
                  tv     5e+03       257     0.895     0.598     0.778     0.717
              laptop     5e+03       236     0.946     0.523     0.745     0.674
               mouse     5e+03        95     0.884     0.563     0.683     0.688
              remote     5e+03       241     0.771     0.265       0.5     0.395
            keyboard     5e+03       117     0.953     0.518     0.701     0.671
          cell phone     5e+03       291     0.764     0.285     0.389     0.415
           microwave     5e+03        88      0.88     0.585     0.747     0.703
                oven     5e+03       142     0.785     0.317     0.533     0.451
             toaster     5e+03        11         0         0     0.166         0
                sink     5e+03       211     0.834     0.389     0.581      0.53
        refrigerator     5e+03       107     0.905     0.534     0.748     0.672
                book     5e+03  1.03e+03     0.494     0.103     0.171      0.17
               clock     5e+03       290     0.935     0.595     0.731     0.727
                vase     5e+03       350     0.862     0.339     0.528     0.487
            scissors     5e+03        56     0.858     0.232     0.408     0.365
          teddy bear     5e+03       238     0.812     0.454     0.648     0.583
          hair drier     5e+03        11         1    0.0909     0.103     0.167
          toothbrush     5e+03        77     0.829     0.189       0.3     0.308
Speed: 5.8/1.8/7.5 ms inference/NMS/total per 416x416 image at batch-size 32

COCO mAP with pycocotools...
loading annotations into memory...
Done (t=4.30s)
creating index...
index created!
Loading and preparing results...
DONE (t=4.70s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *bbox*
DONE (t=69.35s).
Accumulating evaluation results...
DONE (t=9.05s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.341
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.575
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.353
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.158
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.363
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.504
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.290
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.464
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.510
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.304
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.554
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.681
AlphaGoMK commented 4 years ago

Thanks for your prompt reply 😊
Following your advice, I checked my validation data and found mistakes in val2017.shapes. I use train/val data that I generated myself, and I had forgotten to replace the original shapes file with the newly generated one. After fixing it, I get the expected results. Thanks again : )
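
For anyone hitting the same symptom, a minimal sketch of rebuilding a stale *.shapes cache, assuming it stores one "width height" pair per line in the same order as the matching *.txt image list (simply deleting the stale file and letting the dataloader regenerate it on the next run is also an option); the list path below is an assumption:

import os
from PIL import Image

txt = '../../data/val2017.txt'            # image list referenced by the *.data file (path is an assumption)
shapes = txt.replace('.txt', '.shapes')

if os.path.exists(shapes):
    os.remove(shapes)                     # drop the stale cache first

with open(txt) as f, open(shapes, 'w') as out:
    for line in f:
        img_path = line.strip()
        if img_path:
            w, h = Image.open(img_path).size   # PIL returns (width, height)
            out.write(f'{w} {h}\n')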

glenn-jocher commented 7 months ago

Hello @AlphaGoMK,

I'm glad to hear that you've resolved the issue by correcting the val2017.shapes file. It's great that you double-checked your validation data and found the discrepancy. Data integrity is crucial for accurate model evaluation, so it's always a good practice to verify that all files are correctly aligned with your dataset.

If you have any more questions or run into further issues, feel free to reach out. Happy coding and best of luck with your projects! 😊👍