ultralytics / yolov5

YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

lower reproduced mAP (31.2 vs 35.5) for yolov5s #103

Closed amsword closed 4 years ago

amsword commented 4 years ago

By running the following command, I got 31.2 mAP, but the value reported here is 35.5. One thing I found is that multi-scale training is not enabled by default. Is the default configuration expected to reproduce 35.5 mAP, or do I need to add other parameters?

 python train.py --data coco.yaml --cfg yolov5s.yaml --weights '' --batch-size 16

github-actions[bot] commented 4 years ago

Hello @amsword, thank you for your interest in our work! Please visit our Custom Training Tutorial to get started, and see our Jupyter Notebook, Docker Image, and Google Cloud Quickstart Guide for example environments.

If this is a bug report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you.

If this is a custom model or data training question, please note that Ultralytics does not provide free personal support. As a leader in vision ML and AI, we do offer professional consulting, from simple expert advice up to delivery of fully customized, end-to-end production solutions for our clients.

For more information please visit https://www.ultralytics.com.

glenn-jocher commented 4 years ago

@amsword no changes are required, and no extra parameters. The only difference with your command is that we train with a larger batch size of 64 for yolov5s.

We also report the pycocotools mAP, which is 1-2% higher than the natively computed mAP.
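
In concrete terms, only the batch-size flag changes from the command above, and the pycocotools number comes from COCO-JSON evaluation. A hedged sketch of both steps, assuming the train.py/test.py flags of that period (--batch-size, --save-json) and the default weights output path:

 python train.py --data coco.yaml --cfg yolov5s.yaml --weights '' --batch-size 64
 python test.py --data coco.yaml --weights weights/best.pt --img-size 640 --save-json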

intgogo commented 4 years ago

> @amsword no changes are required, and no extra parameters. The only difference with your command is that we train with a larger batch size of 64 for yolov5s.
>
> We also report the pycocotools mAP, which is 1-2% higher than the natively computed mAP.

I use 4 Titan Xp GPUs to train, but I can only fit a batch size of 16 on my custom dataset (yolov5m, 24 classes). After 108 training epochs (20 hours), I got:

    108/299      6.7G   0.03188   0.02646  0.004916   0.06326       110       640    0.6659    0.2483     0.244    0.1677   0.09914   0.04162   0.07194

Update: I set the training parameter rect=True, and the mAP is good now.

glenn-jocher commented 4 years ago

@intgogo haha, I'm not sure. That is pretty strange; typically rect=True trains COCO faster, but with a bit lower mAP.

The exact training results are available in the weights folder under /training_results; you can plot them against your current progress (with utils.plot_results()) to compare your results to the official ones: https://drive.google.com/drive/u/1/folders/1Drs_Aiu7xx6S-ix95f9kNsA6ueKRpN2J
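
For reference, a minimal sketch of that comparison, assuming the repo layout of that era, where plot_results() lives in utils/utils.py and reads the results*.txt files in the working directory:

    # overlay every results*.txt in the current directory and save results.png
    from utils.utils import plot_results
    plot_results()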

intgogo commented 4 years ago

@glenn-jocher Something is wrong. I set rect=True from epoch 109 and rect=False from epoch 123:

    108/299      6.7G   0.03188   0.02646  0.004916   0.06326       110       640    0.6659    0.2483     0.244    0.1677   0.09914   0.04162   0.07194
    109/299     6.72G   0.09716   0.03988   0.07046    0.2075        59       608    0.6412    0.6926    0.6849    0.4804   0.04562   0.02247   0.01995

    122/299     6.72G   0.09467   0.03939    0.0647    0.1988        59       384    0.6412    0.6926    0.6898    0.4823   0.04508   0.02275   0.01944
    123/299      6.7G   0.04195   0.03238   0.01123   0.08557       101       608    0.5308    0.2214    0.2233    0.1273   0.07957   0.03333   0.04878

glenn-jocher commented 4 years ago

@intgogo do not ever pause training. Train fully from 0 to 300 epochs.

powermano commented 4 years ago

> @intgogo haha, I'm not sure. That is pretty strange; typically rect=True trains COCO faster, but with a bit lower mAP.
>
> The exact training results are available in the weights folder under /training_results; you can plot them against your current progress (with utils.plot_results()) to compare your results to the official ones: https://drive.google.com/drive/u/1/folders/1Drs_Aiu7xx6S-ix95f9kNsA6ueKRpN2J

rect=False is the default setting; I want to know whether the results reported in your repo were trained with rect=False. Also, rect is True when testing; I want to know whether the mAP will increase a bit when setting rect=False. Thanks.

intgogo commented 4 years ago

@powermano Try rect=False, multi-scale=False, mosaic=False (edit dataset.py line 288), and use the hyp parameters. I finally got the same or better results than darknet.
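
For readers trying this recipe: rect and multi-scale are already off by default, so only mosaic needs the code edit. A hedged sketch of achieving the same effect from outside the file, assuming the early-2020 layout where utils.datasets.LoadImagesAndLabels exposes a mosaic attribute (the list-file path below is a placeholder for your own data):

    # build the training dataset, then hard-disable mosaic, which is normally
    # switched on whenever augment=True and rect=False
    from utils.datasets import LoadImagesAndLabels

    dataset = LoadImagesAndLabels('../coco/train2017.txt', img_size=640,
                                  batch_size=16, augment=True, rect=False)
    dataset.mosaic = False  # same effect as the dataset.py edit described above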

Libaishun commented 4 years ago

@intgogo could you share your training results? I trained yolov5s with the same settings as yours; at 73 epochs I only get AP@0.5 0.33, AP@0.5:0.95 0.16. Compared to mosaic=True at the same epoch, which reaches AP@0.5 0.46, AP@0.5:0.95 0.27, my result is much lower. You said you "finally got the same or better result as darknet": which model do you mean, and how many epochs did you use?

intgogo commented 4 years ago

@Libaishun sorry, I was wrong. In the end I tried yolov3-spp with the parameters above on a single GPU, and got better results than darknet at 608 and with multi-scale (columns: Epoch, gpu_mem, GIoU, obj, cls, total, targets, img_size, P, R, mAP@0.5, mAP@0.5:0.95, val GIoU, val obj, val cls):

     0/299     9.81G   0.06282   0.02756   0.04266     0.133        28       640    0.3046    0.7106    0.5845    0.2295   0.04898    0.0207    0.0151
     1/299     9.81G    0.0483   0.02063   0.01387    0.0828        28       640    0.3896    0.8323     0.717    0.3053   0.04352   0.02002   0.01006
     2/299     9.81G   0.04648   0.02034   0.01144   0.07827        28       640    0.4932    0.8554    0.7882    0.3556   0.04203   0.02014  0.009998
     3/299     9.81G    0.0438   0.01994   0.01022   0.07396        27       640    0.5033    0.8842    0.7987       0.4   0.03971   0.01958   0.00762
     4/299     9.81G   0.04076   0.01937  0.008704   0.06884        18       640     0.535     0.916    0.8494    0.4387   0.03681   0.01902  0.006739
     5/299     9.81G   0.03908   0.01872   0.00775   0.06556        32       640    0.5477    0.9007    0.8336    0.4489   0.03612   0.01861  0.006369
     6/299     9.81G    0.0381   0.01851  0.007406   0.06401        29       640     0.574    0.9139    0.8627    0.4706   0.03572   0.01835  0.006233
     7/299     10.7G   0.03684   0.01808  0.007051   0.06196       102       640    0.5619    0.9123    0.8587    0.4607    0.0366    0.0186  0.006763
     8/299     10.7G   0.03616   0.01786  0.006835   0.06085       100       640    0.5567    0.9141    0.8577     0.467   0.03609   0.01837  0.006252
     9/299     10.7G   0.03551   0.01755  0.006406   0.05947       137       640    0.5619    0.9135    0.8546    0.4642   0.03593   0.01829  0.006483
    10/299     10.7G   0.03486   0.01739  0.006226   0.05847       114       640    0.5789     0.912    0.8575    0.4642   0.03601   0.01828  0.006306
    11/299     10.7G   0.03447   0.01719  0.005949   0.05762       119       640    0.5742     0.917    0.8584     0.471   0.03566   0.01825  0.006203
    12/299     10.7G   0.03428   0.01716  0.005906   0.05735       119       640    0.5643    0.9154    0.8648    0.4774   0.03542   0.01827  0.006175
    13/299     10.7G   0.03392   0.01711   0.00581   0.05684       114       640    0.5831    0.9048    0.8589    0.4673   0.03575   0.01831  0.006267
    14/299     10.7G   0.03375   0.01714  0.005739   0.05663        95       640    0.5673    0.9115    0.8557    0.4784   0.03563   0.01825  0.006304
    15/299     10.7G   0.03356   0.01703  0.005602   0.05619       130       640    0.5794    0.9152    0.8626    0.4813   0.03541   0.01821  0.006246
    16/299     10.7G   0.03319   0.01696  0.005546   0.05569        89       640    0.5816    0.9196    0.8645    0.4832   0.03521   0.01813  0.006204
    17/299     10.7G   0.03307   0.01672  0.005465   0.05526       100       640    0.5868     0.918    0.8675    0.4907   0.03501   0.01814  0.006038
    18/299     10.7G   0.03271   0.01676  0.005546   0.05502       151       640    0.5764    0.9208    0.8683    0.4827   0.03502   0.01818  0.006173
    19/299     10.7G   0.03267   0.01673   0.00534   0.05474       147       640    0.5781    0.9199    0.8672    0.4892   0.03496   0.01812  0.006185
    20/299     10.7G   0.03267   0.01666  0.005281   0.05461       119       640    0.5841     0.918    0.8674    0.4825   0.03518   0.01821  0.006044
    21/299     10.7G   0.03243   0.01669  0.005323   0.05444       113       640    0.5859    0.9164    0.8685    0.4866   0.03509   0.01815  0.006013
    22/299      9.8G   0.03238    0.0165  0.005183   0.05407        28       640    0.6055    0.9337    0.8891     0.517   0.03302   0.01767  0.005537
    23/299      9.8G   0.03218   0.01646   0.00515   0.05378        28       640    0.6166    0.9287    0.8893     0.505   0.03356    0.0178  0.005159
    24/299      9.8G   0.03186   0.01643  0.004973   0.05327        28       640    0.6098    0.9258    0.8972    0.5376   0.03271   0.01749  0.004944
    25/299      9.8G    0.0319   0.01619  0.004945   0.05303        27       640    0.6132    0.9178    0.8903    0.5178   0.03256   0.01758  0.005208
    26/299      9.8G   0.03163   0.01622  0.004942   0.05279        18       640    0.6255    0.9222    0.8914    0.5074   0.03303   0.01806  0.005219
    27/299      9.8G   0.03148   0.01611  0.004759   0.05234        32       640    0.6562    0.9071    0.8837    0.5297   0.03203   0.01762  0.005289
    28/299      9.8G   0.03163   0.01611  0.004873   0.05262        29       640    0.6477     0.921    0.8915    0.5327   0.03269   0.01745  0.005462
    29/299      9.8G   0.03169   0.01645  0.005013   0.05316        33       640    0.6221    0.9123    0.8752    0.4918   0.03372    0.0181  0.005292
    30/299      9.8G   0.03191   0.01655  0.005182   0.05364        26       640    0.6438    0.9249    0.8835    0.5273   0.03177   0.01745  0.005452
    31/299      9.8G   0.03176   0.01647  0.005205   0.05344        34       640    0.6535     0.922    0.8971    0.5451   0.03158    0.0174   0.00499
    32/299      9.8G   0.03191   0.01649   0.00509   0.05349        20       640    0.6312     0.927    0.8969    0.5348   0.03216   0.01751  0.004929
    33/299      9.8G   0.03152   0.01634  0.004906   0.05277        24       640    0.6099    0.9247    0.8972    0.5486   0.03182   0.01734  0.004919
    34/299      9.8G   0.03161   0.01632  0.004937   0.05287        19       640    0.6354    0.9358    0.9094    0.5601   0.03115   0.01725   0.00479
    35/299      9.8G   0.03148    0.0162  0.004898   0.05257        29       640    0.6602    0.9303     0.895    0.5521   0.03114   0.01707  0.005032
    36/299      9.8G    0.0313   0.01621   0.00487   0.05238        37       640    0.6489    0.9249    0.8937    0.5401   0.03118   0.01736  0.004889
    37/299      9.8G   0.03126   0.01608  0.004891   0.05223        29       640    0.6429    0.9274    0.8947    0.5512    0.0308   0.01701  0.004469
    38/299      9.8G   0.03114   0.01614   0.00481   0.05209        15       640    0.6716    0.9369    0.9109    0.5633   0.03109   0.01706   0.00452
    39/299      9.8G   0.03101   0.01595  0.004639    0.0516        26       640    0.6631    0.9342    0.9099    0.5647   0.03047   0.01698  0.004473
    40/299      9.8G   0.03099   0.01599  0.004715    0.0517        34       640     0.652    0.9319    0.9059    0.5675   0.03024   0.01702  0.004735
    41/299      9.8G   0.03095   0.01602  0.004646   0.05162        13       640    0.6798    0.9323    0.9126    0.5695   0.03032   0.01701  0.004522
    42/299      9.8G    0.0309     0.016  0.004748   0.05164        26       640    0.6957    0.9326    0.9078    0.5654   0.03039    0.0168  0.004592
    43/299      9.8G   0.03091   0.01598  0.004765   0.05165        28       640    0.6756    0.9311     0.911    0.5781   0.03018   0.01682   0.00458
    44/299      9.8G   0.03097     0.016  0.004782   0.05174        28       640    0.6698    0.9315    0.9105    0.5738   0.03013     0.017  0.004489
    45/299      9.8G    0.0308   0.01608  0.004678   0.05155        31       640    0.6785    0.9363    0.9095    0.5748   0.02997   0.01689  0.004611
    46/299      9.8G   0.03078   0.01598  0.004566   0.05133        19       640    0.6593    0.9437    0.9182    0.5853   0.02959   0.01659   0.00454
    47/299      9.8G   0.03057   0.01588  0.004489   0.05094        20       640    0.6651    0.9339    0.9116      0.58   0.02943    0.0167  0.004239
    48/299      9.8G   0.03066   0.01589   0.00475   0.05131        25       640    0.6836    0.9389    0.9144    0.5902   0.02946   0.01675  0.004729
    49/299      9.8G   0.03055   0.01583  0.004612   0.05099        26       640    0.7035    0.9411    0.9141    0.5882   0.02945   0.01644  0.004387
    50/299      9.8G   0.03068   0.01594  0.004765   0.05139        19       640    0.6889    0.9417    0.9179    0.5921   0.02934   0.01644  0.004262
    51/299      9.8G   0.03042   0.01573  0.004524   0.05067        25       640    0.6872    0.9359    0.9161    0.5927   0.02902   0.01633  0.004238
    52/299      9.8G   0.03067   0.01591  0.004688   0.05127        45       640    0.7042    0.9576    0.9309    0.5996   0.02896   0.01627  0.004249
    53/299      9.8G   0.03033   0.01576  0.004537   0.05063        24       640    0.6904    0.9378    0.9145    0.6002   0.02883   0.01646  0.004222
    54/299      9.8G   0.03022   0.01571  0.004435   0.05036        18       640     0.685    0.9352    0.9135    0.6047   0.02868   0.01632  0.004212
    55/299      9.8G   0.03024   0.01573  0.004541   0.05051        22       640    0.6906    0.9389    0.9208    0.6005   0.02877   0.01637  0.004187
    56/299      9.8G    0.0302   0.01574   0.00446    0.0504        29       640    0.6822     0.942    0.9196    0.6029   0.02853   0.01634  0.004192
    57/299      9.8G   0.03028   0.01577  0.004532   0.05059        28       640    0.6853    0.9422    0.9191    0.6075   0.02846   0.01632  0.004288
    58/299      9.8G   0.03024   0.01569  0.004473    0.0504        24       640    0.6908    0.9432     0.924    0.6098   0.02842   0.01621  0.004168
    59/299      9.8G   0.03046    0.0158  0.004708   0.05097        20       640    0.6883    0.9442     0.926    0.6031   0.02847   0.01617  0.004111
    60/299      9.8G   0.03018    0.0158  0.004526   0.05051        34       640    0.6957    0.9433    0.9237    0.6055   0.02835   0.01606  0.004121
    61/299      9.8G   0.03011   0.01569  0.004508    0.0503        26       640    0.6881    0.9375    0.9197    0.6031   0.02829   0.01616   0.00403
    62/299      9.8G   0.03009   0.01576  0.004559   0.05041        28       640    0.6891    0.9437    0.9233    0.6138   0.02822    0.0161  0.004051
    63/299      9.8G   0.02978   0.01544  0.004258   0.04948        27       640    0.6963    0.9474    0.9273    0.6159     0.028   0.01608  0.004026
    64/299      9.8G   0.02972   0.01543  0.004417   0.04956        17       640    0.6837    0.9449    0.9229    0.6187   0.02791   0.01601  0.003981
    65/299      9.8G   0.02987   0.01554  0.004439   0.04985        20       640    0.6904    0.9513    0.9244    0.6176   0.02789   0.01602  0.004019
    66/299      9.8G   0.02964   0.01553    0.0043   0.04948        44       640    0.6909     0.944    0.9207     0.616   0.02803   0.01612   0.00392
    67/299      9.8G   0.02979   0.01551  0.004448   0.04975        23       640    0.6941    0.9402    0.9206    0.6187   0.02791   0.01607  0.003838
    68/299      9.8G   0.02966   0.01554  0.004263   0.04946        41       640    0.7025    0.9442    0.9226    0.6162    0.0278   0.01608  0.003891
    69/299      9.8G   0.02967   0.01544   0.00441   0.04952        31       640    0.7006    0.9441    0.9227    0.6187   0.02773   0.01603  0.003848
    70/299      9.8G   0.02976   0.01535  0.004315   0.04942        23       640    0.7024    0.9456    0.9237    0.6224   0.02761     0.016  0.003798
    71/299      9.8G   0.02981   0.01542  0.004393   0.04961        25       640    0.6982     0.943    0.9223    0.6201   0.02767   0.01601  0.003764
    72/299      9.8G   0.02952   0.01539  0.004196    0.0491        28       640    0.6942    0.9444    0.9217    0.6214   0.02767   0.01598  0.003738
    73/299      9.8G   0.02944   0.01531  0.004157   0.04891        23       640    0.6928    0.9437    0.9251    0.6246   0.02762   0.01597  0.003709
    74/299      9.8G   0.02946   0.01528  0.004175   0.04891        25       640    0.6951    0.9461    0.9271    0.6272    0.0275   0.01594  0.003717
    75/299      9.8G   0.02956   0.01528  0.004182   0.04903        29       640    0.6946    0.9465    0.9261    0.6247    0.0275   0.01593  0.003726
    76/299      9.8G   0.02957    0.0155   0.00442   0.04949        44       640    0.6942    0.9472    0.9248    0.6242    0.0275   0.01593   0.00371
    77/299      9.8G    0.0295   0.01541   0.00421   0.04911        34       640    0.6947    0.9476    0.9241     0.621   0.02749   0.01593  0.003708
    78/299      9.8G   0.02932   0.01523  0.004174   0.04872        31       640    0.6946    0.9473    0.9208    0.6205   0.02748   0.01591  0.003722
    79/299      9.8G   0.02943   0.01533  0.004298   0.04906        24       640    0.6985    0.9473    0.9209     0.621   0.02746   0.01589  0.003714
    80/299      9.8G   0.02938   0.01536  0.004203   0.04894        39       640    0.6991    0.9472    0.9261    0.6235    0.0275   0.01589  0.003691
    81/299      9.8G   0.02939   0.01529   0.00417   0.04885        23       640    0.6989    0.9472    0.9252    0.6236   0.02746    0.0159  0.003693
    82/299      9.8G   0.02908   0.01506  0.004064    0.0482        29       640    0.6965    0.9473    0.9266    0.6234   0.02743    0.0159  0.003696
    83/299      9.8G   0.02928   0.01526    0.0043   0.04884        15       640    0.6931    0.9466    0.9249     0.623    0.0274   0.01591  0.003681
    84/299      9.8G    0.0291   0.01525  0.003979   0.04833        34       640    0.6922    0.9464    0.9226    0.6214    0.0274    0.0159  0.003688
    85/299      9.8G   0.02909   0.01508  0.004147   0.04832        25       640     0.691    0.9458    0.9242    0.6237   0.02737   0.01589  0.003677
    86/299      9.8G   0.02893   0.01498   0.00406   0.04797        28       640    0.6885    0.9457    0.9238    0.6223   0.02737   0.01589   0.00367
    87/299      9.8G   0.02929   0.01528  0.004189   0.04876        10       640    0.6876    0.9455    0.9233    0.6242   0.02731   0.01588  0.003655
    88/299      9.8G   0.02916   0.01529  0.004285   0.04873        27       640    0.6874    0.9459    0.9234     0.625   0.02731   0.01589  0.003654
    89/299      9.8G   0.02908   0.01515  0.004132   0.04836        24       640    0.6864    0.9459    0.9212    0.6245    0.0273   0.01589  0.003644
    90/299      9.8G   0.02893   0.01501  0.004002   0.04795        42       640    0.6867    0.9458    0.9208    0.6256    0.0273   0.01589  0.003638
    91/299      9.8G   0.02889   0.01497  0.004061   0.04792        29       640    0.6864    0.9459    0.9209    0.6254   0.02728    0.0159  0.003623
    92/299      9.8G   0.02871   0.01497  0.003928   0.04762        46       640    0.6861     0.946    0.9209    0.6261   0.02727   0.01589  0.003621
    93/299      9.8G   0.02857   0.01486  0.003973   0.04741        41       640    0.6866    0.9463     0.921    0.6257   0.02726    0.0159  0.003621
    94/299      9.8G   0.02866   0.01506  0.003947   0.04767        22       640    0.6866    0.9463    0.9232    0.6279   0.02726   0.01591  0.003619
    95/299      9.8G   0.02875   0.01505  0.004114   0.04792        27       640    0.6861    0.9461     0.923     0.627   0.02726    0.0159  0.003613
    96/299      9.8G   0.02855    0.0149  0.003894   0.04734        13       640    0.6859    0.9462    0.9234    0.6279   0.02726    0.0159  0.003614
    97/299      9.8G   0.02848   0.01484  0.003871    0.0472        20       640     0.686    0.9472    0.9266    0.6293   0.02726    0.0159  0.003613
    98/299      9.8G   0.02842   0.01479  0.003824   0.04703        36       640    0.6859    0.9474    0.9266    0.6296   0.02726   0.01591  0.003619
    99/299      9.8G   0.02839   0.01481  0.003934   0.04714        35       640    0.6856    0.9475    0.9262    0.6294   0.02725   0.01591  0.003619
   100/299      9.8G   0.02853   0.01491   0.00396   0.04739        32       640    0.6856    0.9466    0.9261    0.6291   0.02724   0.01592  0.003609
   101/299      9.8G   0.02851   0.01483  0.003898   0.04724        41       640    0.6845    0.9469    0.9255    0.6295   0.02724   0.01593   0.00361
   102/299      9.8G   0.02849   0.01485  0.004074   0.04742        34       640    0.6854    0.9463    0.9252    0.6296   0.02724   0.01593  0.003599
   103/299      9.8G   0.02819   0.01469  0.003753   0.04663        19       640     0.685    0.9464    0.9252    0.6298   0.02723   0.01593  0.003597
   104/299      9.8G   0.02832   0.01472  0.003832   0.04687        16       640    0.6858    0.9472    0.9258    0.6297   0.02723   0.01594   0.00359
   105/299      9.8G    0.0285   0.01504  0.004045   0.04759        29       640    0.6877    0.9475    0.9257    0.6302   0.02719   0.01593  0.003572
   106/299      9.8G   0.02852   0.01497  0.003965   0.04746        16       640     0.688    0.9476    0.9257    0.6294   0.02721   0.01593  0.003575
   107/299      9.8G    0.0284   0.01493  0.003928   0.04726        32       640     0.688    0.9475     0.926    0.6305   0.02722   0.01594  0.003579
   108/299      9.8G   0.02812   0.01472  0.003855   0.04669        22       640    0.6877    0.9482    0.9258    0.6309   0.02724   0.01596   0.00358
   109/299      9.8G     0.028    0.0146  0.003748   0.04635        19       640     0.688    0.9483    0.9255    0.6302   0.02722   0.01596  0.003576
   110/299      9.8G   0.02805   0.01457  0.003769   0.04638        23       640    0.6875     0.949    0.9257    0.6305   0.02723   0.01596  0.003573
   111/299      9.8G   0.02801   0.01461  0.003751   0.04637        25       640    0.6871     0.949    0.9258    0.6312   0.02723   0.01596   0.00357
   112/299      9.8G   0.02813   0.01459  0.003814   0.04653        28       640     0.687    0.9487    0.9258    0.6312   0.02724   0.01597  0.003584
   113/299      9.8G   0.02796   0.01455  0.003725   0.04624        27       640    0.6864    0.9483    0.9261     0.631    0.0272   0.01597  0.003563
   114/299      9.8G   0.02786   0.01447  0.003731   0.04606        39       640     0.686    0.9483    0.9256    0.6306   0.02723   0.01597   0.00358
   115/299      9.8G   0.02793   0.01456  0.003835   0.04633        23       640    0.6851    0.9483    0.9259    0.6304   0.02724   0.01597  0.003579
   116/299      9.8G   0.02778   0.01462  0.003691   0.04609        25       640    0.6858    0.9482    0.9255     0.632    0.0272   0.01597  0.003565
   117/299      9.8G   0.02779   0.01448  0.003724     0.046        23       640    0.6868    0.9486    0.9259    0.6315    0.0272   0.01597  0.003567
   118/299      9.8G   0.02766   0.01444  0.003847   0.04595        38       640    0.6859    0.9485    0.9261    0.6329    0.0272   0.01597  0.003559
   119/299      9.8G   0.02755   0.01438  0.003684   0.04562        19       640    0.6861    0.9485    0.9261    0.6326    0.0272   0.01596  0.003562
   120/299      9.8G    0.0276   0.01435  0.003636   0.04559        33       640    0.6865    0.9485    0.9259    0.6323    0.0272   0.01597  0.003568
   121/299      9.8G    0.0276   0.01438  0.003694   0.04567        31       640    0.6863    0.9486    0.9259    0.6321    0.0272   0.01597  0.003572
   122/299      9.8G   0.02762   0.01436  0.003728   0.04571        25       640     0.689    0.9487    0.9258    0.6312   0.02721   0.01597  0.003577
   123/299      9.8G   0.02758    0.0145  0.003761   0.04584        14       640      0.69    0.9485    0.9258     0.631   0.02718   0.01595  0.003571
   124/299      9.8G   0.02731   0.01438   0.00357   0.04526        22       640    0.6914    0.9486    0.9259    0.6311   0.02718   0.01595  0.003573
   125/299      9.8G   0.02757   0.01445  0.003771   0.04579        30       640    0.6922    0.9484    0.9258    0.6315   0.02716   0.01594  0.003576
   126/299      9.8G   0.02739   0.01439   0.00358   0.04535        23       640    0.6931    0.9485    0.9258     0.632   0.02714   0.01596  0.003571
   127/299      9.8G   0.02734   0.01423  0.003615   0.04518        24       640    0.6941    0.9486    0.9262    0.6322   0.02718   0.01596  0.003584
   128/299      9.8G   0.02761   0.01448  0.003661   0.04575        27       640    0.6933    0.9487    0.9262    0.6323   0.02715   0.01595  0.003572
   129/299      9.8G   0.02748   0.01454  0.003775   0.04579        25       640    0.6942    0.9487    0.9265    0.6324   0.02717   0.01595  0.003577
   130/299      9.8G   0.02727   0.01428  0.003594   0.04514        22       640    0.6946    0.9487    0.9263    0.6316   0.02716   0.01596  0.003575
   131/299      9.8G   0.02725   0.01426  0.003626   0.04513        31       640    0.6954    0.9487    0.9263    0.6321   0.02717   0.01596  0.003589
   132/299      9.8G   0.02712   0.01419  0.003584   0.04489        15       640    0.6957    0.9486    0.9266    0.6323   0.02717   0.01596  0.003586
   133/299      9.8G   0.02723   0.01423  0.003652   0.04511        23       640     0.696    0.9481    0.9262    0.6325   0.02715   0.01596  0.003586
   134/299      9.8G   0.02703   0.01415  0.003405   0.04459        26       640    0.6968    0.9482    0.9258    0.6329   0.02715   0.01596  0.003591
   135/299      9.8G    0.0269   0.01404  0.003452   0.04439        36       640    0.6969    0.9481    0.9257    0.6335   0.02715   0.01597  0.003595
   136/299      9.8G   0.02697   0.01412   0.00351    0.0446        39       640    0.6969    0.9481    0.9255    0.6331   0.02716   0.01597  0.003605
   137/299      9.8G   0.02675   0.01394  0.003407   0.04409        26       640    0.6963    0.9488    0.9258    0.6337   0.02716   0.01597  0.003607
   138/299      9.8G   0.02686   0.01397  0.003473    0.0443        29       640     0.696    0.9486    0.9258    0.6339   0.02716   0.01597  0.003609
   139/299      9.8G   0.02668   0.01395  0.003364   0.04399        24       640    0.6966    0.9489    0.9259    0.6333   0.02717   0.01597  0.003607
   140/299      9.8G   0.02677   0.01394  0.003529   0.04424        35       640     0.697    0.9489    0.9258    0.6333   0.02716   0.01597  0.003608
   141/299      9.8G   0.02676   0.01403  0.003522   0.04431        29       640    0.6949    0.9485     0.926    0.6348   0.02715   0.01598  0.003604
   142/299      9.8G   0.02655     0.014  0.003396   0.04395        20       640    0.6949    0.9486    0.9262    0.6349   0.02714   0.01599  0.003604
   143/299      9.8G   0.02672   0.01403  0.003472   0.04422        35       640    0.6954    0.9486    0.9261    0.6351   0.02717   0.01599  0.003616
   144/299      9.8G   0.02656   0.01394  0.003347   0.04384        34       640    0.6965    0.9488    0.9263    0.6342   0.02717     0.016   0.00361
   145/299      9.8G   0.02653   0.01387  0.003391   0.04378        32       640    0.6962    0.9488    0.9264    0.6349   0.02717     0.016  0.003605
   146/299      9.8G   0.02634   0.01375   0.00325   0.04334        21       640    0.6961    0.9486    0.9262    0.6355   0.02716   0.01599    0.0036
   147/299      9.8G   0.02629   0.01377  0.003296   0.04335        28       640    0.6977    0.9485    0.9261    0.6357   0.02716   0.01599  0.003598
   148/299      9.8G    0.0263   0.01374  0.003265    0.0433        25       640    0.6979    0.9485    0.9259    0.6359   0.02716     0.016  0.003594
   149/299      9.8G   0.02629   0.01375  0.003363    0.0434        35       640    0.6987    0.9484    0.9259    0.6364   0.02717     0.016  0.003589
   150/299      9.8G    0.0263   0.01373  0.003285   0.04332        25       640    0.7033    0.9478    0.9253    0.6391   0.02717     0.016  0.003585
   151/299      9.8G   0.02619   0.01369  0.003278   0.04315        34       640    0.7042    0.9479     0.925    0.6399   0.02716     0.016  0.003584
   152/299      9.8G   0.02611   0.01373  0.003313   0.04316        37       640    0.7045     0.948    0.9252    0.6391   0.02716     0.016   0.00358
   153/299      9.8G   0.02616   0.01371   0.00329   0.04316        24       640    0.7049    0.9479     0.925    0.6391   0.02716   0.01601  0.003582
   154/299      9.8G   0.02603   0.01363  0.003285   0.04294        35       640    0.7055    0.9479     0.925    0.6398   0.02716   0.01601   0.00358
   155/299      9.8G     0.026    0.0136  0.003255   0.04286        34       640     0.706    0.9477    0.9251    0.6405   0.02715     0.016  0.003579
   156/299      9.8G   0.02594   0.01357  0.003168   0.04268        37       640    0.7057    0.9477    0.9253    0.6406   0.02715   0.01601  0.003576
   157/299      9.8G   0.02588   0.01361  0.003245   0.04273        36       640    0.7068    0.9477    0.9252    0.6403   0.02715   0.01601  0.003575
   158/299      9.8G   0.02585   0.01355  0.003141   0.04254        26       640    0.7074    0.9477    0.9254    0.6402   0.02715     0.016  0.003577
   159/299      9.8G   0.02571   0.01349  0.003132   0.04233        22       640    0.7077    0.9476    0.9255    0.6399   0.02715   0.01601  0.003578
   160/299      9.8G   0.02579   0.01352  0.003206   0.04251        27       640    0.7075    0.9476    0.9255    0.6408   0.02715   0.01601  0.003576

From epochs 7 to 10 I switched to 4 GPUs, and from epoch 11 I switched back to single-GPU training. I will try yolov5 soon.

intgogo commented 4 years ago

@Libaishun I trained yolov5m for 9 epochs and got AP@0.5 0.85, AP@0.5:0.95 0.44. Results:

     0/299     4.86G   0.06559   0.02788   0.04426    0.1377        30       640    0.2839    0.7074      0.55    0.2356    0.0499   0.02246   0.01646
     1/299     4.86G   0.05105   0.02221   0.01541   0.08867        31       640    0.4267    0.7913    0.6602    0.2596   0.04803   0.02154   0.01069
     2/299     4.86G   0.05054   0.02244   0.01324   0.08622        28       640     0.455    0.7625    0.6559    0.2863   0.04914   0.02254   0.01125
     3/299     4.86G   0.04858   0.02253   0.01209   0.08319        26       640    0.4855    0.8249     0.749    0.3341   0.04401   0.02155  0.008853
     4/299     4.86G     0.045   0.02149    0.0103   0.07679        30       640    0.4674    0.8473    0.7699    0.3523   0.04455   0.02078  0.007824
     5/299     4.86G   0.04276   0.02054  0.008906   0.07221        26       640    0.5602    0.8676    0.8227    0.3943   0.04095   0.01991  0.007223
     6/299     4.86G   0.04127   0.02025  0.008276    0.0698        56       640    0.5708    0.8846    0.8426    0.4223   0.04052   0.01966  0.006458
     7/299     4.86G   0.04032   0.01989  0.007948   0.06816        44       640    0.5592    0.9042    0.8514    0.4491   0.03802   0.01934  0.006216
     8/299     4.86G   0.03958   0.01956  0.007467   0.06661        39       640      0.58    0.8932    0.8563    0.4487   0.03756   0.01938  0.005977

Conclusion:

1. Set the input image size to [640, 640].
2. Use 1 GPU.
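
A hedged one-line version of that recipe, assuming the train.py of that period (where --img-size takes the train and test sizes and --device pins a single GPU), with the data and cfg files as placeholders:

    python train.py --data coco.yaml --cfg yolov5m.yaml --weights '' --img-size 640 640 --batch-size 16 --device 0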

Libaishun commented 4 years ago

@intgogo Thanks for your reply, yet it seems you were either not training on the COCO dataset, or training on COCO but not from scratch. I'm curious about the result on COCO. @glenn-jocher said training with mosaic=False just results in a bit lower mAP, but in my experiments it seems to result in much lower mAP. I don't favor mosaic augmentation because it is not a generally useful trick; it hurts in many of my datasets. My training has now reached epoch 85 and the mAP is still stuck at AP@0.5 0.34, AP@0.5:0.95 0.17. I'll wait until the 300 epochs finish to see the final results.

github-actions[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.