ceccocats / tkDNN

Deep neural network library and toolkit to do high performance inference on NVIDIA Jetson platforms
GNU General Public License v2.0

Failing to export efficientnet-lite3 #140

Closed AIgiraffe closed 3 years ago

AIgiraffe commented 4 years ago

```
darknet: ./src/darknet.c:522: run_export: Assertion `0 == "layer type not supported for export"' failed.
Aborted (core dumped)

layer filters size/strd(dil) input output
  0 conv 16 3 x 3/ 2 608 x 608 x 3 -> 304 x 304 x 16 0.080 BF
  1 conv 16 1 x 1/ 1 304 x 304 x 16 -> 304 x 304 x 16 0.047 BF
  2 conv 16/ 16 3 x 3/ 1 304 x 304 x 16 -> 304 x 304 x 16 0.027 BF
  3 conv 8 1 x 1/ 1 304 x 304 x 16 -> 304 x 304 x 8 0.024 BF
  4 conv 16 1 x 1/ 1 304 x 304 x 8 -> 304 x 304 x 16 0.024 BF
  5 conv 16/ 16 3 x 3/ 1 304 x 304 x 16 -> 304 x 304 x 16 0.027 BF
  6 conv 8 1 x 1/ 1 304 x 304 x 16 -> 304 x 304 x 8 0.024 BF
  7 dropout p = 0.200 739328 -> 739328
  8 Shortcut Layer: 3, wt = 0, wn = 0, outputs: 304 x 304 x 8 0.001 BF
  9 conv 48 1 x 1/ 1 304 x 304 x 8 -> 304 x 304 x 48 0.071 BF
 10 conv 48/ 48 3 x 3/ 2 304 x 304 x 48 -> 152 x 152 x 48 0.020 BF
 11 conv 16 1 x 1/ 1 152 x 152 x 48 -> 152 x 152 x 16 0.035 BF
 12 conv 64 1 x 1/ 1 152 x 152 x 16 -> 152 x 152 x 64 0.047 BF
 13 conv 64/ 64 3 x 3/ 1 152 x 152 x 64 -> 152 x 152 x 64 0.027 BF
 14 conv 16 1 x 1/ 1 152 x 152 x 64 -> 152 x 152 x 16 0.047 BF
 15 dropout p = 0.200 369664 -> 369664
 16 Shortcut Layer: 11, wt = 0, wn = 0, outputs: 152 x 152 x 16 0.000 BF
 17 conv 64 1 x 1/ 1 152 x 152 x 16 -> 152 x 152 x 64 0.047 BF
 18 conv 64/ 64 3 x 3/ 1 152 x 152 x 64 -> 152 x 152 x 64 0.027 BF
 19 conv 16 1 x 1/ 1 152 x 152 x 64 -> 152 x 152 x 16 0.047 BF
 20 dropout p = 0.200 369664 -> 369664
 21 Shortcut Layer: 16, wt = 0, wn = 0, outputs: 152 x 152 x 16 0.000 BF
 22 conv 64 1 x 1/ 1 152 x 152 x 16 -> 152 x 152 x 64 0.047 BF
 23 conv 64/ 64 3 x 3/ 2 152 x 152 x 64 -> 76 x 76 x 64 0.007 BF
 24 conv 16 1 x 1/ 1 76 x 76 x 64 -> 76 x 76 x 16 0.012 BF
 25 conv 96 1 x 1/ 1 76 x 76 x 16 -> 76 x 76 x 96 0.018 BF
 26 conv 96/ 96 3 x 3/ 1 76 x 76 x 96 -> 76 x 76 x 96 0.010 BF
 27 conv 16 1 x 1/ 1 76 x 76 x 96 -> 76 x 76 x 16 0.018 BF
 28 dropout p = 0.200 92416 -> 92416
 29 Shortcut Layer: 24, wt = 0, wn = 0, outputs: 76 x 76 x 16 0.000 BF
 30 conv 96 1 x 1/ 1 76 x 76 x 16 -> 76 x 76 x 96 0.018 BF
 31 conv 96/ 96 3 x 3/ 1 76 x 76 x 96 -> 76 x 76 x 96 0.010 BF
 32 conv 16 1 x 1/ 1 76 x 76 x 96 -> 76 x 76 x 16 0.018 BF
 33 dropout p = 0.200 92416 -> 92416
 34 Shortcut Layer: 29, wt = 0, wn = 0, outputs: 76 x 76 x 16 0.000 BF
 35 conv 96 1 x 1/ 1 76 x 76 x 16 -> 76 x 76 x 96 0.018 BF
 36 conv 96/ 96 3 x 3/ 1 76 x 76 x 96 -> 76 x 76 x 96 0.010 BF
 37 conv 32 1 x 1/ 1 76 x 76 x 96 -> 76 x 76 x 32 0.035 BF
 38 conv 192 1 x 1/ 1 76 x 76 x 32 -> 76 x 76 x 192 0.071 BF
 39 conv 192/ 192 3 x 3/ 1 76 x 76 x 192 -> 76 x 76 x 192 0.020 BF
 40 conv 32 1 x 1/ 1 76 x 76 x 192 -> 76 x 76 x 32 0.071 BF
 41 dropout p = 0.200 184832 -> 184832
 42 Shortcut Layer: 37, wt = 0, wn = 0, outputs: 76 x 76 x 32 0.000 BF
 43 conv 192 1 x 1/ 1 76 x 76 x 32 -> 76 x 76 x 192 0.071 BF
 44 conv 192/ 192 3 x 3/ 1 76 x 76 x 192 -> 76 x 76 x 192 0.020 BF
 45 conv 32 1 x 1/ 1 76 x 76 x 192 -> 76 x 76 x 32 0.071 BF
 46 dropout p = 0.200 184832 -> 184832
 47 Shortcut Layer: 42, wt = 0, wn = 0, outputs: 76 x 76 x 32 0.000 BF
 48 conv 192 1 x 1/ 1 76 x 76 x 32 -> 76 x 76 x 192 0.071 BF
 49 conv 192/ 192 3 x 3/ 1 76 x 76 x 192 -> 76 x 76 x 192 0.020 BF
 50 conv 32 1 x 1/ 1 76 x 76 x 192 -> 76 x 76 x 32 0.071 BF
 51 dropout p = 0.200 184832 -> 184832
 52 Shortcut Layer: 47, wt = 0, wn = 0, outputs: 76 x 76 x 32 0.000 BF
 53 conv 192 1 x 1/ 1 76 x 76 x 32 -> 76 x 76 x 192 0.071 BF
 54 conv 192/ 192 3 x 3/ 1 76 x 76 x 192 -> 76 x 76 x 192 0.020 BF
 55 conv 32 1 x 1/ 1 76 x 76 x 192 -> 76 x 76 x 32 0.071 BF
 56 dropout p = 0.200 184832 -> 184832
 57 Shortcut Layer: 52, wt = 0, wn = 0, outputs: 76 x 76 x 32 0.000 BF
 58 conv 192 1 x 1/ 1 76 x 76 x 32 -> 76 x 76 x 192 0.071 BF
 59 conv 192/ 192 3 x 3/ 2 76 x 76 x 192 -> 38 x 38 x 192 0.005 BF
 60 conv 48 1 x 1/ 1 38 x 38 x 192 -> 38 x 38 x 48 0.027 BF
 61 conv 272 1 x 1/ 1 38 x 38 x 48 -> 38 x 38 x 272 0.038 BF
 62 conv 272/ 272 3 x 3/ 1 38 x 38 x 272 -> 38 x 38 x 272 0.007 BF
 63 conv 48 1 x 1/ 1 38 x 38 x 272 -> 38 x 38 x 48 0.038 BF
 64 dropout p = 0.200 69312 -> 69312
 65 Shortcut Layer: 60, wt = 0, wn = 0, outputs: 38 x 38 x 48 0.000 BF
 66 conv 272 1 x 1/ 1 38 x 38 x 48 -> 38 x 38 x 272 0.038 BF
 67 conv 272/ 272 3 x 3/ 1 38 x 38 x 272 -> 38 x 38 x 272 0.007 BF
 68 conv 48 1 x 1/ 1 38 x 38 x 272 -> 38 x 38 x 48 0.038 BF
 69 dropout p = 0.200 69312 -> 69312
 70 Shortcut Layer: 65, wt = 0, wn = 0, outputs: 38 x 38 x 48 0.000 BF
 71 conv 272 1 x 1/ 1 38 x 38 x 48 -> 38 x 38 x 272 0.038 BF
 72 conv 272/ 272 3 x 3/ 1 38 x 38 x 272 -> 38 x 38 x 272 0.007 BF
 73 conv 48 1 x 1/ 1 38 x 38 x 272 -> 38 x 38 x 48 0.038 BF
 74 dropout p = 0.200 69312 -> 69312
 75 Shortcut Layer: 70, wt = 0, wn = 0, outputs: 38 x 38 x 48 0.000 BF
 76 conv 272 1 x 1/ 1 38 x 38 x 48 -> 38 x 38 x 272 0.038 BF
 77 conv 272/ 272 3 x 3/ 1 38 x 38 x 272 -> 38 x 38 x 272 0.007 BF
 78 conv 48 1 x 1/ 1 38 x 38 x 272 -> 38 x 38 x 48 0.038 BF
 79 dropout p = 0.200 69312 -> 69312
 80 Shortcut Layer: 75, wt = 0, wn = 0, outputs: 38 x 38 x 48 0.000 BF
 81 conv 272 1 x 1/ 1 38 x 38 x 48 -> 38 x 38 x 272 0.038 BF
 82 conv 272/ 272 3 x 3/ 2 38 x 38 x 272 -> 19 x 19 x 272 0.002 BF
 83 conv 96 1 x 1/ 1 19 x 19 x 272 -> 19 x 19 x 96 0.019 BF
 84 conv 448 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 448 0.031 BF
 85 conv 448/ 448 3 x 3/ 1 19 x 19 x 448 -> 19 x 19 x 448 0.003 BF
 86 conv 96 1 x 1/ 1 19 x 19 x 448 -> 19 x 19 x 96 0.031 BF
 87 dropout p = 0.200 34656 -> 34656
 88 Shortcut Layer: 83, wt = 0, wn = 0, outputs: 19 x 19 x 96 0.000 BF
 89 conv 448 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 448 0.031 BF
 90 conv 448/ 448 3 x 3/ 1 19 x 19 x 448 -> 19 x 19 x 448 0.003 BF
 91 conv 96 1 x 1/ 1 19 x 19 x 448 -> 19 x 19 x 96 0.031 BF
 92 dropout p = 0.200 34656 -> 34656
 93 Shortcut Layer: 88, wt = 0, wn = 0, outputs: 19 x 19 x 96 0.000 BF
 94 conv 448 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 448 0.031 BF
 95 conv 448/ 448 3 x 3/ 1 19 x 19 x 448 -> 19 x 19 x 448 0.003 BF
 96 conv 96 1 x 1/ 1 19 x 19 x 448 -> 19 x 19 x 96 0.031 BF
 97 dropout p = 0.200 34656 -> 34656
 98 Shortcut Layer: 93, wt = 0, wn = 0, outputs: 19 x 19 x 96 0.000 BF
 99 conv 448 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 448 0.031 BF
100 conv 448/ 448 3 x 3/ 1 19 x 19 x 448 -> 19 x 19 x 448 0.003 BF
101 conv 96 1 x 1/ 1 19 x 19 x 448 -> 19 x 19 x 96 0.031 BF
102 dropout p = 0.200 34656 -> 34656
103 Shortcut Layer: 98, wt = 0, wn = 0, outputs: 19 x 19 x 96 0.000 BF
104 conv 448 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 448 0.031 BF
105 conv 448/ 448 3 x 3/ 1 19 x 19 x 448 -> 19 x 19 x 448 0.003 BF
106 conv 96 1 x 1/ 1 19 x 19 x 448 -> 19 x 19 x 96 0.031 BF
107 dropout p = 0.200 34656 -> 34656
108 Shortcut Layer: 103, wt = 0, wn = 0, outputs: 19 x 19 x 96 0.000 BF
109 conv 96 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 96 0.007 BF
110 conv 96/ 96 5 x 5/ 1 19 x 19 x 96 -> 19 x 19 x 96 0.002 BF
111 conv 128 1 x 1/ 1 19 x 19 x 96 -> 19 x 19 x 128 0.009 BF
112 conv 128/ 128 5 x 5/ 1 19 x 19 x 128 -> 19 x 19 x 128 0.002 BF
113 conv 128 1 x 1/ 1 19 x 19 x 128 -> 19 x 19 x 128 0.012 BF
114 conv 75 1 x 1/ 1 19 x 19 x 128 -> 19 x 19 x 75 0.007 BF
115 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.00
nms_kind: greedynms (1), beta = 0.600000
```

The dropout layer causes the error. How can I fix it? Thank you!
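
One possible workaround, not verified against this repository: the assertion in `run_export` fires as soon as the export walk reaches a layer type it has no handler for, and the only unusual layers in this cfg are the `[dropout]` sections. Since dropout is an identity at inference time, those sections can be stripped from the cfg before exporting without changing the network's outputs. Below is only a sketch of that idea; the file names are placeholders, and it assumes `[shortcut]` layers use the usual negative relative `from=` offsets, which it re-points so they still reference the same original layers after the dropout sections are removed.

```python
#!/usr/bin/env python3
"""Sketch: strip [dropout] sections from a darknet .cfg before export."""

def parse_sections(lines):
    """Split a cfg into [header, body-lines] sections, keeping comments and blanks."""
    sections, current = [], None
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            current = [stripped, []]
            sections.append(current)
        elif current is not None:
            current[1].append(line.rstrip("\n"))
    return sections

def strip_dropout(cfg_in, cfg_out):
    with open(cfg_in) as f:
        sections = parse_sections(f.readlines())

    net_section = sections[0]   # [net] / [network]
    layers = sections[1:]       # every other section is a layer

    # Map old layer index -> new layer index once dropout layers are gone.
    old_to_new, kept = {}, []
    for old_idx, (header, body) in enumerate(layers):
        if header == "[dropout]":
            # dropout is identity at inference: anything referencing it
            # should point at the layer that fed it instead
            old_to_new[old_idx] = len(kept) - 1
        else:
            old_to_new[old_idx] = len(kept)
            kept.append((old_idx, header, body))

    out_lines = [net_section[0]] + net_section[1]
    for new_idx, (old_idx, header, body) in enumerate(kept):
        out_lines.append(header)
        for line in body:
            key = line.split("=")[0].strip() if "=" in line else ""
            if header == "[shortcut]" and key == "from":
                old_from = int(line.split("=")[1])
                # negative value = relative offset, positive = absolute index
                old_target = old_idx + old_from if old_from < 0 else old_from
                new_target = old_to_new[old_target]
                line = "from=%d" % (new_target - new_idx)  # keep it relative
            out_lines.append(line)

    with open(cfg_out, "w") as f:
        f.write("\n".join(out_lines) + "\n")

if __name__ == "__main__":
    # hypothetical file names, adjust to your setup
    strip_dropout("efficientnet-lite3.cfg", "efficientnet-lite3-nodropout.cfg")
```

An alternative route would be to patch the export code in the darknet fork so that dropout layers are simply skipped rather than triggering the assertion, since they carry no weights to dump; the cfg rewrite above avoids touching the C code at all.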

mive93 commented 3 years ago

Can you give more details? What's that output?

mive93 commented 3 years ago

Closing for now, feel free to reopen.