wang-xinyu / tensorrtx

Implementation of popular deep learning networks with TensorRT network definition API
MIT License

yolov4 result is very poor #921

Closed tuteming closed 2 years ago

tuteming commented 2 years ago

Your problem

yolov4 result is very poor: compared with the original yolov4 (AlexeyAB), the results from this repo's yolov4 are much worse on my custom data. The same trained weights reach 99.96% with the original yolov4, but only about 70% here.

wang-xinyu commented 2 years ago

Maybe you need to check the anchors.

tuteming commented 2 years ago

I carefully checked the whole process (I had already made all the necessary changes): image size=800x800, net size=800x800, classes=6. My yolov4.cfg is identical to yours except that net size was changed from 608x608 to 800x800. The main difference is in your yolov4.cpp, where each layer is defined (lines 244-471): many layers there use padding=0, while yolov4.cfg uses pad=1. After I changed yolov4.cpp to match and ran yolov4.exe -s, I got:

Loading weights: ../yolov4.wts
[03/02/2022-14:40:10] [E] [TRT] (Unnamed Layer* 18) [ElementWise]: elementwise inputs must have same dimensions or follow broadcast rules (input dimensions were [64,404,404] and [64,402,402]).
(the same ElementWise error is printed ten times in total)
Building tensorrt engine, please wait for a while...
[03/02/2022-14:40:10] [E] [TRT] (Unnamed Layer* 18) [ElementWise]: elementwise inputs must have same dimensions or follow broadcast rules (input dimensions were [64,404,404] and [64,402,402]).
[03/02/2022-14:40:10] [E] [TRT] Could not compute dimensions for (Unnamed Layer* 18) [ElementWise]_output, because the network is not valid.
[03/02/2022-14:40:10] [E] [TRT] Network validation failed.
Build engine successfully!

Running yolov4.exe -d samples then gives (presumably because the failed build never wrote a valid engine file, so deserialization receives a null blob):

[03/02/2022-15:11:16] [E] [TRT] Parameter check failed at: runtime.cpp::nvinfer1::Runtime::deserializeCudaEngine::30, condition: (blob) != nullptr

I checked darknet; at 800x800 it runs as:

0 conv 32 3 x 3/ 1 800 x 800 x 3 -> 800 x 800 x 32 1.106 BF
1 conv 64 3 x 3/ 2 800 x 800 x 32 -> 400 x 400 x 64 5.898 BF
2 conv 64 1 x 1/ 1 400 x 400 x 64 -> 400 x 400 x 64 1.311 BF
3 route 1 -> 400 x 400 x 64
4 conv 64 1 x 1/ 1 400 x 400 x 64 -> 400 x 400 x 64 1.311 BF
5 conv 32 1 x 1/ 1 400 x 400 x 64 -> 400 x 400 x 32 0.655 BF
6 conv 64 3 x 3/ 1 400 x 400 x 32 -> 400 x 400 x 64 5.898 BF
7 Shortcut Layer: 4, wt = 0, wn = 0, outputs: 400 x 400 x 64 0.010 BF
8 conv 64 1 x 1/ 1 400 x 400 x 64 -> 400 x 400 x 64 1.311 BF
9 route 8 2 -> 400 x 400 x 128
10 conv 64 1 x 1/ 1 400 x 400 x 128 -> 400 x 400 x 64 2.621 BF
11 conv 128 3 x 3/ 2 400 x 400 x 64 -> 200 x 200 x 128 5.898 BF

Every feature map here is 400x400, neither 404x404 nor 402x402. Please advise, thanks.
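The 402/404 sizes are consistent with standard convolution output arithmetic, out = (in + 2*pad - k) / stride + 1: a 1x1 conv given padding=1 grows the map by 2 pixels, which darknet's pad=1 never does for 1x1 kernels. A minimal sketch of the arithmetic (plain Python; names are illustrative):

# Standard conv output size (no dilation), as used by darknet and TensorRT.
def conv_out(size, k, stride, pad):
    return (size + 2 * pad - k) // stride + 1

print(conv_out(800, k=3, stride=2, pad=1))  # 400: the 3x3/2 downsample conv
print(conv_out(400, k=1, stride=1, pad=1))  # 402: a 1x1 conv wrongly padded by 1
print(conv_out(402, k=1, stride=1, pad=1))  # 404: a second padded 1x1 conv

With padding=0 on the 1x1 convs both branches stay at 400x400, as darknet prints; with padding=1 the shortcut joins a branch that passed through two padded 1x1 convs (404x404) with one that passed through only one (402x402), matching the reported [64,404,404] vs [64,402,402].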

tuteming commented 2 years ago

yolov4.cpp with padding=0 and yolov4.cfg with pad=1 are already different definitions, so why does the generated yolov4.engine still run, just with very poor accuracy? Yet after changing yolov4.cpp to padding=1 so that it matches yolov4.cfg's pad=1, the generated yolov4.engine instead produces the errors above.

Please advise. Thanks again.

wang-xinyu commented 2 years ago

Which yolov4.cfg are you checking? Can you share the link?

tuteming commented 2 years ago

a-version: https://github.com/AlexeyAB/darknet/blob/master/build/darknet/x64/cfg/yolov4.cfg
u-version: https://github.com/ultralytics/yolov3/blob/archive/cfg/yolov4.cfg

The network structures of the two versions are identical; the only difference is that the a-version adds max_delta=5 at the end of each of the three [yolo] sections, which only affects training, not inference. When running python gen_wts.py yolov4.weights, max_delta=5 must be commented out with # because it is not supported. Both versions use pad=1 from start to finish, while parts of yolov4.cpp use padding=0; that is what puzzles me. Likewise, tensorrtx/scaled-yolov4/yolov4_csp.cpp shows the same padding=0 situation relative to https://github.com/WongKinYiu/ScaledYOLOv4/tree/yolov4-csp. I have studied the YOLO architecture for 3 years; a mismatched padding size is a serious problem.

Since I am new to TensorRT, please advise. Thanks again.

tuteming commented 2 years ago

I have already checked makaveli10's GitHub; it is the same as yours, also padding=0.

wang-xinyu commented 2 years ago

pad=1 in the .cfg doesn't mean that the conv layer has padding=1.

Can you check the conv layer?

tuteming commented 2 years ago

"padding=0 in .cfg doesn't mean that the conv layer has padding=0," =>yolov4.cpp 的line 242 // define each layer. 不是要根據yolov4.cfg 每一層來定義嗎? "can you check the conv layer" 請問the conv layer指的是yolov4.cpp? 我上述的錯誤中 Loading weights: ../yolov4.wts [03/02/2022-14:40:10] [E] [TRT] (Unnamed Layer* 18) [ElementWise]: elementwise inputs must have same dimensions or follow broadcast rules (input dimensions were [64,404,404] and [64,402,402]).

(Unnamed Layer* 18指的是哪一層(line num )? 64,404,404] and [64,402,402]).顯然就是route那一層發現維度不等

再次打擾你,真是抱歉.請再次賜教 感謝
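One way to map "(Unnamed Layer* 18)" back to a line of yolov4.cpp: TensorRT appears to number unnamed layers in the order they are added through the network definition API, so dumping the network right after it is defined shows which layer index 18 is. A sketch with the TensorRT Python API (the C++ INetworkDefinition has matching getNbLayers()/getLayer() accessors; names can also be set explicitly with setName in yolov4.cpp):

import tensorrt as trt

def dump_layers(network: trt.INetworkDefinition):
    # "(Unnamed Layer* N)" corresponds to index N in creation order.
    for i in range(network.num_layers):
        layer = network.get_layer(i)
        out = layer.get_output(0) if layer.num_outputs > 0 else None
        print(i, layer.type, layer.name,
              tuple(out.shape) if out is not None else None)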

wang-xinyu commented 2 years ago

You can use this repo https://github.com/ultralytics/yolov3/blob/archive/cfg/yolov4.cfg to load the pytorch model and print it, then check each conv layer.
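A minimal sketch of that check (assuming the archive branch of ultralytics/yolov3 is on the path, where models.py defines Darknet; paths are illustrative):

import torch.nn as nn
from models import Darknet  # ultralytics/yolov3, archive branch

model = Darknet('cfg/yolov4.cfg', img_size=608)
for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        # Every 1x1 conv prints padding=(0, 0) even though the cfg says pad=1.
        print(name, 'kernel =', module.kernel_size, 'padding =', module.padding)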

tuteming commented 2 years ago

Here is the result; apart from size=608, everything is the same as mine.
Both versions use pad=1 from start to finish, while parts of yolov4.cpp use padding=0; that is what puzzles me.

0 conv 32 3 x 3/ 1 608 x 608 x 3 -> 608 x 608 x 32 0.639 BF
1 conv 64 3 x 3/ 2 608 x 608 x 32 -> 304 x 304 x 64 3.407 BF
2 conv 64 1 x 1/ 1 304 x 304 x 64 -> 304 x 304 x 64 0.757 BF
3 route 1 -> 304 x 304 x 64
4 conv 64 1 x 1/ 1 304 x 304 x 64 -> 304 x 304 x 64 0.757 BF
5 conv 32 1 x 1/ 1 304 x 304 x 64 -> 304 x 304 x 32 0.379 BF
6 conv 64 3 x 3/ 1 304 x 304 x 32 -> 304 x 304 x 64 3.407 BF
7 Shortcut Layer: 4, wt = 0, wn = 0, outputs: 304 x 304 x 64 0.006 BF
8 conv 64 1 x 1/ 1 304 x 304 x 64 -> 304 x 304 x 64 0.757 BF
9 route 8 2 -> 304 x 304 x 128
10 conv 64 1 x 1/ 1 304 x 304 x 128 -> 304 x 304 x 64 1.514 BF
11 conv 128 3 x 3/ 2 304 x 304 x 64 -> 152 x 152 x 128 3.407 BF
12 conv 64 1 x 1/ 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BF
13 route 11 -> 152 x 152 x 128
14 conv 64 1 x 1/ 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BF
15 conv 64 1 x 1/ 1 152 x 152 x 64 -> 152 x 152 x 64 0.189 BF
16 conv 64 3 x 3/ 1 152 x 152 x 64 -> 152 x 152 x 64 1.703 BF
17 Shortcut Layer: 14, wt = 0, wn = 0, outputs: 152 x 152 x 64 0.001 BF
18 conv 64 1 x 1/ 1 152 x 152 x 64 -> 152 x 152 x 64 0.189 BF
19 conv 64 3 x 3/ 1 152 x 152 x 64 -> 152 x 152 x 64 1.703 BF
20 Shortcut Layer: 17, wt = 0, wn = 0, outputs: 152 x 152 x 64 0.001 BF
21 conv 64 1 x 1/ 1 152 x 152 x 64 -> 152 x 152 x 64 0.189 BF
22 route 21 12 -> 152 x 152 x 128
23 conv 128 1 x 1/ 1 152 x 152 x 128 -> 152 x 152 x 128 0.757 BF
24 conv 256 3 x 3/ 2 152 x 152 x 128 -> 76 x 76 x 256 3.407 BF
25 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
26 route 24 -> 76 x 76 x 256
27 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
28 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
29 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
30 Shortcut Layer: 27, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
31 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
32 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
33 Shortcut Layer: 30, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
34 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
35 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
36 Shortcut Layer: 33, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
37 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
38 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
39 Shortcut Layer: 36, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
40 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
41 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
42 Shortcut Layer: 39, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
43 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
44 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
45 Shortcut Layer: 42, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
46 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
47 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
48 Shortcut Layer: 45, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
49 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
50 conv 128 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 128 1.703 BF
51 Shortcut Layer: 48, wt = 0, wn = 0, outputs: 76 x 76 x 128 0.001 BF
52 conv 128 1 x 1/ 1 76 x 76 x 128 -> 76 x 76 x 128 0.189 BF
53 route 52 25 -> 76 x 76 x 256
54 conv 256 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 256 0.757 BF
55 conv 512 3 x 3/ 2 76 x 76 x 256 -> 38 x 38 x 512 3.407 BF
56 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
57 route 55 -> 38 x 38 x 512
58 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
59 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
60 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
61 Shortcut Layer: 58, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
62 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
63 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
64 Shortcut Layer: 61, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
65 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
66 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
67 Shortcut Layer: 64, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
68 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
69 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
70 Shortcut Layer: 67, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
71 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
72 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
73 Shortcut Layer: 70, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
74 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
75 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
76 Shortcut Layer: 73, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
77 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
78 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
79 Shortcut Layer: 76, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
80 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
81 conv 256 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 256 1.703 BF
82 Shortcut Layer: 79, wt = 0, wn = 0, outputs: 38 x 38 x 256 0.000 BF
83 conv 256 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 256 0.189 BF
84 route 83 56 -> 38 x 38 x 512
85 conv 512 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 512 0.757 BF
86 conv 1024 3 x 3/ 2 38 x 38 x 512 -> 19 x 19 x1024 3.407 BF
87 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
88 route 86 -> 19 x 19 x1024
89 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
90 conv 512 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.189 BF
91 conv 512 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x 512 1.703 BF
92 Shortcut Layer: 89, wt = 0, wn = 0, outputs: 19 x 19 x 512 0.000 BF
93 conv 512 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.189 BF
94 conv 512 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x 512 1.703 BF
95 Shortcut Layer: 92, wt = 0, wn = 0, outputs: 19 x 19 x 512 0.000 BF
96 conv 512 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.189 BF
97 conv 512 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x 512 1.703 BF
98 Shortcut Layer: 95, wt = 0, wn = 0, outputs: 19 x 19 x 512 0.000 BF
99 conv 512 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.189 BF
100 conv 512 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x 512 1.703 BF
101 Shortcut Layer: 98, wt = 0, wn = 0, outputs: 19 x 19 x 512 0.000 BF
102 conv 512 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.189 BF
103 route 102 87 -> 19 x 19 x1024
104 conv 1024 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x1024 0.757 BF
105 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
106 conv 1024 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BF
107 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
108 max 5x 5/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.005 BF
109 route 107 -> 19 x 19 x 512
110 max 9x 9/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.015 BF
111 route 107 -> 19 x 19 x 512
112 max 13x13/ 1 19 x 19 x 512 -> 19 x 19 x 512 0.031 BF
113 route 112 110 108 107 -> 19 x 19 x2048
114 conv 512 1 x 1/ 1 19 x 19 x2048 -> 19 x 19 x 512 0.757 BF
115 conv 1024 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BF
116 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
117 conv 256 1 x 1/ 1 19 x 19 x 512 -> 19 x 19 x 256 0.095 BF
118 upsample 2x 19 x 19 x 256 -> 38 x 38 x 256
119 route 85 -> 38 x 38 x 512
120 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
121 route 120 118 -> 38 x 38 x 512
122 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
123 conv 512 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BF
124 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
125 conv 512 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BF
126 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
127 conv 128 1 x 1/ 1 38 x 38 x 256 -> 38 x 38 x 128 0.095 BF
128 upsample 2x 38 x 38 x 128 -> 76 x 76 x 128
129 route 54 -> 76 x 76 x 256
130 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
131 route 130 128 -> 76 x 76 x 256
132 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
133 conv 256 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BF
134 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
135 conv 256 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BF
136 conv 128 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BF
137 conv 256 3 x 3/ 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BF
138 conv 255 1 x 1/ 1 76 x 76 x 256 -> 76 x 76 x 255 0.754 BF
139 yolo
140 route 136 -> 76 x 76 x 128
141 conv 256 3 x 3/ 2 76 x 76 x 128 -> 38 x 38 x 256 0.852 BF
142 route 141 126 -> 38 x 38 x 512
143 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
144 conv 512 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BF
145 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
146 conv 512 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BF
147 conv 256 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BF
148 conv 512 3 x 3/ 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BF
149 conv 255 1 x 1/ 1 38 x 38 x 512 -> 38 x 38 x 255 0.377 BF
150 yolo
151 route 147 -> 38 x 38 x 256
152 conv 512 3 x 3/ 2 38 x 38 x 256 -> 19 x 19 x 512 0.852 BF
153 route 152 116 -> 19 x 19 x1024
154 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
155 conv 1024 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BF
156 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
157 conv 1024 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BF
158 conv 512 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BF
159 conv 1024 3 x 3/ 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BF
160 conv 255 1 x 1/ 1 19 x 19 x1024 -> 19 x 19 x 255 0.189 BF
161 yolo

tuteming commented 2 years ago

I ran another experiment.
Keeping yolov4.cpp's padding=0 unchanged, I transcribed its layer definitions (// define each layer) back into the original yolov4.cfg (pad=1) and ran darknet.exe (a-version): the printed structures for padding=0 and padding=1 came out identical, and my program's accuracy was the same in both cases. Apparently the padding change has no effect. Could USE_FP16 be causing the error? (That seems unlikely.) So I tried //#define USE_FP16 // comment out this if want to use FP32 — FP32 doubled the runtime, but the error is the same.

yolov5.cpp works very well in my tests, and it does not need a detour through the u-version to convert the darknet config and weights. For yolov4, the gen_wts.py conversion prints the warning "smart bias initialization failure". Have you run into this?

wang-xinyu commented 2 years ago

For this [convolution] https://github.com/ultralytics/yolov3/blob/98068efebc699e7a652fb495f3e7a23bf296affd/cfg/yolov4.cfg#L48

[convolutional]
batch_normalize=1
filters=64
size=1
stride=1
pad=1
activation=mish

pad=1 is set, but the conv layer will have padding=0.

Because of this line: https://github.com/ultralytics/yolov3/blob/98068efebc699e7a652fb495f3e7a23bf296affd/models.py#L31

padding=k // 2 if mdef['pad'] else 0,
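In other words, pad=1 in the cfg only means "same" padding, and the actual per-layer padding is derived from the kernel size; for a 1x1 kernel that is 0, which is why yolov4.cpp hardcodes padding=0 on those convs. A quick sketch of the rule:

# Same rule as the models.py line above: 'pad' is a boolean flag, not a pixel count.
def conv_padding(k, pad_flag):
    return k // 2 if pad_flag else 0

print(conv_padding(3, 1))  # 1: 3x3 convs are padded by one pixel
print(conv_padding(1, 1))  # 0: 1x1 convs get no padding, matching yolov4.cpp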
tuteming commented 2 years ago

Do you mean that changing pad=1 to pad=0 will make the smart bias initialization failure go away, and that the converted yolov4.wts, run through yolov4.exe to build yolov4.engine, will then be OK? Thanks, I will try.

tuteming commented 2 years ago

The same warning still appears 3 times: WARNING: smart bias initialization failure. The result is unchanged. Anyway, thank you.

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.