Closed: mairkiss closed this issue 1 year ago
Your log is hard to read; please upload it as a txt file.
At first glance it looks like your plugin is running in INT8 but doesn't have the INT8 scale. Can you double-check it?
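If the scales really are missing, one way to unblock the build is to set a dynamic range explicitly on every tensor that calibration could not cover, which typically includes plugin inputs/outputs. A rough sketch against the TensorRT 8.x C++ API (the -127/127 range below is just a placeholder; use ranges derived from your own calibration data):

```cpp
#include "NvInfer.h"

// Sketch: after building the network, give an explicit INT8 scale to any
// tensor that still has none (typically tensors produced or consumed by
// plugins). The -127/127 range is a placeholder, not a real calibration.
void setMissingDynamicRanges(nvinfer1::INetworkDefinition* network)
{
    for (int32_t i = 0; i < network->getNbLayers(); ++i)
    {
        nvinfer1::ILayer* layer = network->getLayer(i);
        for (int32_t j = 0; j < layer->getNbOutputs(); ++j)
        {
            nvinfer1::ITensor* tensor = layer->getOutput(j);
            if (!tensor->dynamicRangeIsSet())
            {
                tensor->setDynamicRange(-127.0f, 127.0f); // placeholder range
            }
        }
    }
}
```

Alternatively, you can mark the plugin layers to run in FP16/FP32 (e.g. layer->setPrecision(nvinfer1::DataType::kHALF)) so they don't need an INT8 scale at all.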
Thanks, this is my log:
set kINT8 at layer[0]Conv_0[Conv] set kINT8 at layer[1]Relu_1[Relu] set kINT8 at layer[2]Conv_2[Conv] set kINT8 at layer[3]Relu_3[Relu] set kINT8 at layer[4]Conv_4[Conv] set kINT8 at layer[5]Relu_5[Relu] set kINT8 at layer[6]Conv_6[Conv] set kINT8 at layer[7]Relu_7[Relu] set kINT8 at layer[8]Conv_8[Conv] set kINT8 at layer[9]Relu_9[Relu] set kINT8 at layer[10]Add_10[ElementWise1] set kINT8 at layer[11]Conv_11[Conv] set kINT8 at layer[12]Relu_12[Relu] set kINT8 at layer[13]Conv_13[Conv] set kINT8 at layer[14]Relu_14[Relu] set kINT8 at layer[15]Concat_15[Concat] set kINT8 at layer[16]Conv_16[Conv] set kINT8 at layer[17]Relu_17[Relu] set kINT8 at layer[18]Conv_18[Conv] set kINT8 at layer[19]Relu_19[Relu] set kINT8 at layer[20]Conv_20[Conv] set kINT8 at layer[21]Add_21[ElementWise1] set kINT8 at layer[22]Relu_22[Relu] set kINT8 at layer[23]Conv_23[Conv] set kINT8 at layer[24]Relu_24[Relu] set kINT8 at layer[25]Conv_25[Conv] set kINT8 at layer[26]Add_26[ElementWise1] set kINT8 at layer[27]Relu_27[Relu] set kINT8 at layer[28]Conv_28[Conv] set kINT8 at layer[29]Relu_29[Relu] set kINT8 at layer[30]Conv_30[Conv] set kINT8 at layer[31]Add_31[ElementWise1] set kINT8 at layer[32]Relu_32[Relu] set kINT8 at layer[33]Conv_33[Conv] set kINT8 at layer[34]Relu_34[Relu] set kINT8 at layer[35]Conv_35[Conv] set kINT8 at layer[36]Conv_36[Conv] set kINT8 at layer[37]Add_37[ElementWise1] set kINT8 at layer[38]Relu_38[Relu] set kINT8 at layer[39]Conv_39[Conv] set kINT8 at layer[40]Relu_40[Relu] set kINT8 at layer[41]Conv_41[Conv] set kINT8 at layer[42]Add_42[ElementWise1] set kINT8 at layer[43]Relu_43[Relu] set kINT8 at layer[44]Conv_44[Conv] set kINT8 at layer[45]Relu_45[Relu] set kINT8 at layer[46]Conv_46[Conv] set kINT8 at layer[47]Add_47[ElementWise1] set kINT8 at layer[48]Relu_48[Relu] set kINT8 at layer[49]Conv_49[Conv] set kINT8 at layer[50]Relu_50[Relu] set kINT8 at layer[51]Conv_51[Conv] set kINT8 at layer[52]Add_52[ElementWise1] set kINT8 at layer[53]Relu_53[Relu] set kINT8 at layer[54]Conv_54[Conv] set kINT8 at layer[55]Relu_55[Relu] set kINT8 at layer[56]Conv_56[Conv] set kINT8 at layer[57]Conv_57[Conv] set kINT8 at layer[58]Add_58[ElementWise1] set kINT8 at layer[59]Relu_59[Relu] set kINT8 at layer[60]Conv_60[Conv] set kINT8 at layer[61]Relu_61[Relu] set kINT8 at layer[62]Conv_62[Conv] set kINT8 at layer[63]Add_63[ElementWise1] set kINT8 at layer[64]Relu_64[Relu] set kINT8 at layer[65]Conv_65[Conv] set kINT8 at layer[66]Relu_66[Relu] set kINT8 at layer[67]Conv_67[Conv] set kINT8 at layer[68]Add_68[ElementWise1] set kINT8 at layer[69]Relu_69[Relu] set kINT8 at layer[70]Conv_70[Conv] set kINT8 at layer[71]Relu_71[Relu] set kINT8 at layer[72]Conv_72[Conv] set kINT8 at layer[73]Add_73[ElementWise1] set kINT8 at layer[74]Relu_74[Relu] set kINT8 at layer[75]Conv_75[Conv] set kINT8 at layer[76]Relu_76[Relu] set kINT8 at layer[77]Conv_77[Conv] set kINT8 at layer[78]Add_78[ElementWise1] set kINT8 at layer[79]Relu_79[Relu] set kINT8 at layer[80]Conv_80[Conv] set kINT8 at layer[81]Relu_81[Relu] set kINT8 at layer[82]Conv_82[Conv] set kINT8 at layer[83]Add_83[ElementWise1] set kINT8 at layer[84]Relu_84[Relu] set kINT8 at layer[85]Conv_85[Conv] set kINT8 at layer[86]Relu_86[Relu] set kINT8 at layer[87]Conv_87[Conv] set kINT8 at layer[88]Conv_88[Conv] set kINT8 at layer[89]Add_89[ElementWise1] set kINT8 at layer[90]Relu_90[Relu] set kINT8 at layer[91]Conv_91[Conv] set kINT8 at layer[92]Relu_92[Relu] set kINT8 at layer[93]Conv_93[Conv] set kINT8 at layer[94]Add_94[ElementWise1] set kINT8 at 
layer[95]Relu_95[Relu] set kINT8 at layer[96]Conv_96[Conv] set kINT8 at layer[97]Relu_97[Relu] set kINT8 at layer[98]Conv_98[Conv] set kINT8 at layer[99]Add_99[ElementWise1] set kINT8 at layer[100]Relu_100[Relu] set kINT8 at layer[101](Unnamed Layer 101) [Shuffle] set kINT8 at layer[102]Reshape_103[Reshape] set kINT8 at layer[103](Unnamed Layer 103) [Shuffle] set kINT8 at layer[104]Reshape_106[Reshape] set kINT8 at layer[105](Unnamed Layer 105) [Shuffle] set kINT8 at layer[106]Reshape_109[Reshape] set kINT8 at layer[107](Unnamed Layer 107) [Shuffle] set kINT8 at layer[108]Reshape_112[Reshape] set kINT8 at layer[109](Unnamed Layer 109) [Shuffle] set kINT8 at layer[110]Reshape_115[Reshape] set kINT8 at layer[111](Unnamed Layer 111) [Shuffle] set kINT8 at layer[112]Reshape_118[Reshape] set kINT8 at layer[113]Conv_119[Conv] set kINT8 at layer[114]Split_120[SplitPlugin] set kINT8 at layer[115]Concat_121[Concat] set kINT8 at layer[116]Sigmoid_122[Sigmoid] set kINT8 at layer[117]DCNv2_123[DCNv2] set kINT8 at layer[118]Upsample_124[Plugin.Upsample] set kINT8 at layer[119]Conv_125[Conv] set kINT8 at layer[120]Split_126[SplitPlugin] set kINT8 at layer[121]Concat_127[Concat] set kINT8 at layer[122]Sigmoid_128[Sigmoid] set kINT8 at layer[123]DCNv2_129[DCNv2] set kINT8 at layer[124]Add_130[ElementWise1] set kINT8 at layer[125]Upsample_131[Plugin.Upsample] set kINT8 at layer[126]Conv_132[Conv] set kINT8 at layer[127]Split_133[SplitPlugin] set kINT8 at layer[128]Concat_134[Concat] set kINT8 at layer[129]Sigmoid_135[Sigmoid] set kINT8 at layer[130]DCNv2_136[DCNv2] set kINT8 at layer[131]Add_137[ElementWise1] set kINT8 at layer[132]Conv_138[Conv] set kINT8 at layer[133]Split_139[SplitPlugin] set kINT8 at layer[134]Concat_140[Concat] set kINT8 at layer[135]Sigmoid_141[Sigmoid] set kINT8 at layer[136]DCNv2_142[DCNv2] set kINT8 at layer[137]Conv_143[Conv] set kINT8 at layer[138]Relu_144[Relu] set kINT8 at layer[139]Conv_145[Conv] set kINT8 at layer[140]Relu_146[Relu] set kINT8 at layer[141]Conv_147[Conv] set kINT8 at layer[142]Relu_148[Relu] set kINT8 at layer[143]Conv_149[Conv] set kINT8 at layer[144]Relu_150[Relu] set kINT8 at layer[145]Conv_151[Conv] set kINT8 at layer[146]Conv_152[Conv] set kINT8 at layer[147]Relu_153[Relu] set kINT8 at layer[148]Conv_154[Conv] set kINT8 at layer[149]Relu_155[Relu] set kINT8 at layer[150]Conv_156[Conv] set kINT8 at layer[151]Conv_157[Conv] set kINT8 at layer[152]Relu_158[Relu] set kINT8 at layer[153]Conv_159[Conv] set kINT8 at layer[154]Relu_160[Relu] set kINT8 at layer[155]Conv_161[Conv] set kINT8 at layer[156]Relu_162[Relu] set kINT8 at layer[157]Conv_163[Conv] set kINT8 at layer[158]Relu_164[Relu] set kINT8 at layer[159]Conv_165[Conv] set kINT8 at layer[160]Conv_166[Conv] set kINT8 at layer[161]Relu_167[Relu] set kINT8 at layer[162]Conv_168[Conv] set kINT8 at layer[163]Relu_169[Relu] set kINT8 at layer[164]Conv_170[Conv] set kINT8 at layer[165]Conv_171[Conv] set kINT8 at layer[166]Relu_172[Relu] set kINT8 at layer[167]Conv_173[Conv] set kINT8 at layer[168]Relu_174[Relu] set kINT8 at layer[169]Conv_175[Conv] set kINT8 at layer[170]Relu_176[Relu] set kINT8 at layer[171]Conv_177[Conv] set kINT8 at layer[172]Relu_178[Relu] set kINT8 at layer[173]Conv_179[Conv] set kINT8 at layer[174]Conv_180[Conv] set kINT8 at layer[175]Relu_181[Relu] set kINT8 at layer[176]Conv_182[Conv] set kINT8 at layer[177]Relu_183[Relu] set kINT8 at layer[178]Conv_184[Conv] set kINT8 at layer[179]Conv_185[Conv] set kINT8 at layer[180]Relu_186[Relu] set kINT8 at layer[181]Conv_187[Conv] set kINT8 at 
layer[182]Relu_188[Relu] set kINT8 at layer[183]Conv_189[Conv] set kINT8 at layer[184]Conv_190[Conv] set kINT8 at layer[185]Relu_191[Relu] set kINT8 at layer[186]Conv_192[Conv] set kINT8 at layer[187]Relu_193[Relu] set kINT8 at layer[188]Conv_194[Conv] set kINT8 at layer[189]Conv_195[Conv] set kINT8 at layer[190]Relu_196[Relu] set kINT8 at layer[191]Conv_197[Conv] set kINT8 at layer[192]Relu_198[Relu] set kINT8 at layer[193]Conv_199[Conv] set kINT8 at layer[194]Concat_200[Concat] [2022-11-08 06:01:47 WARNING] Half2 support requested on hardware without native FP16 support, performance will be negatively affected. [2022-11-08 06:01:47 WARNING] Convolution + generic activation fusion is disable due to incompatible driver or nvrtc [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_0[Conv] + Relu_1[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_2[Conv] + Relu_3[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_4[Conv] + Relu_5[Relu] || Conv_13[Conv] + Relu_14[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_6[Conv] + Relu_7[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_8[Conv] + Relu_9[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Add_10[ElementWise1] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_11[Conv] + Relu_12[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_16[Conv] + Relu_17[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_18[Conv] + Relu_19[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_20[Conv] + Add_21[ElementWise1] + Relu_22[Relu] obeys the requested constraints in strict mode. 
No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_23[Conv] + Relu_24[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_25[Conv] + Add_26[ElementWise1] + Relu_27[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_28[Conv] + Relu_29[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_30[Conv] + Add_31[ElementWise1] + Relu_32[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_33[Conv] + Relu_34[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_36[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_35[Conv] + Add_37[ElementWise1] + Relu_38[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_39[Conv] + Relu_40[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_41[Conv] + Add_42[ElementWise1] + Relu_43[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_44[Conv] + Relu_45[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_46[Conv] + Add_47[ElementWise1] + Relu_48[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. 
[2022-11-08 06:01:47 WARNING] No implementation of layer Conv_49[Conv] + Relu_50[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_51[Conv] + Add_52[ElementWise1] + Relu_53[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_54[Conv] + Relu_55[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_57[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_56[Conv] + Add_58[ElementWise1] + Relu_59[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_60[Conv] + Relu_61[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_62[Conv] + Add_63[ElementWise1] + Relu_64[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_65[Conv] + Relu_66[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_67[Conv] + Add_68[ElementWise1] + Relu_69[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_70[Conv] + Relu_71[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:47 WARNING] No implementation of layer Conv_72[Conv] + Add_73[ElementWise1] + Relu_74[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_75[Conv] + Relu_76[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. 
requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_77[Conv] + Add_78[ElementWise1] + Relu_79[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_80[Conv] + Relu_81[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_82[Conv] + Add_83[ElementWise1] + Relu_84[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_85[Conv] + Relu_86[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_88[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_87[Conv] + Add_89[ElementWise1] + Relu_90[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_91[Conv] + Relu_92[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_93[Conv] + Add_94[ElementWise1] + Relu_95[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_96[Conv] + Relu_97[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_98[Conv] + Add_99[ElementWise1] + Relu_100[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer (Unnamed Layer 101) [Shuffle] + Reshape_103[Reshape] + (Unnamed Layer 103) [Shuffle] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. 
[2022-11-08 06:01:48 WARNING] No implementation of layer Reshape_106[Reshape] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer (Unnamed Layer 105) [Shuffle] + Reshape_109[Reshape] + (Unnamed Layer 107) [Shuffle] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Reshape_112[Reshape] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer (Unnamed Layer 109) [Shuffle] + Reshape_115[Reshape] + (Unnamed Layer 111) [Shuffle] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Reshape_118[Reshape] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_119[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Split_120[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer PWN(Sigmoid_122[Sigmoid]) obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer DCNv2_123[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Upsample_124[Plugin.Upsample] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_125[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Split_126[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. 
[2022-11-08 06:01:48 WARNING] No implementation of layer PWN(Sigmoid_128[Sigmoid]) obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer DCNv2_129[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Add_130[ElementWise1] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Upsample_131[Plugin.Upsample] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_132[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Split_133[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer PWN(Sigmoid_135[Sigmoid]) obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer DCNv2_136[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Add_137[ElementWise1] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_138[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Split_139[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer PWN(Sigmoid_141[Sigmoid]) obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer DCNv2_142[DCNv2] obeys the requested constraints in strict mode. 
No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_143[Conv] + Relu_144[Relu] || Conv_152[Conv] + Relu_153[Relu] || Conv_157[Conv] + Relu_158[Relu] || Conv_166[Conv] + Relu_167[Relu] || Conv_171[Conv] + Relu_172[Relu] || Conv_180[Conv] + Relu_181[Relu] || Conv_185[Conv] + Relu_186[Relu] || Conv_190[Conv] + Relu_191[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_145[Conv] + Relu_146[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_147[Conv] + Relu_148[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_149[Conv] + Relu_150[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_151[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_154[Conv] + Relu_155[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_156[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_159[Conv] + Relu_160[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_161[Conv] + Relu_162[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_163[Conv] + Relu_164[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_165[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. 
requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_168[Conv] + Relu_169[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_170[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_173[Conv] + Relu_174[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_175[Conv] + Relu_176[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_177[Conv] + Relu_178[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_179[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_182[Conv] + Relu_183[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_184[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_187[Conv] + Relu_188[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_189[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_192[Conv] + Relu_193[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_194[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. 
[2022-11-08 06:01:48 WARNING] No implementation of layer Conv_195[Conv] + Relu_196[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_197[Conv] + Relu_198[Relu] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:01:48 WARNING] No implementation of layer Conv_199[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. getBatch1[0] nbBindings=1 input.1 modelType=4 ReadBinFile to inputStream size=2334720 Bytes Read size 2334720 vs Need size 2334720 getBatch1[1] nbBindings=1 input.1 modelType=4 ReadBinFile to inputStream size=2334720 Bytes Read size 2334720 vs Need size 2334720 getBatch1[2] nbBindings=1 input.1 modelType=4 ReadBinFile to inputStream size=2334720 Bytes Read size 2334720 vs Need size 2334720 [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 729, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 732, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 735, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 738, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 741, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 744, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 747, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 469, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 750, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 753, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 756, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 478, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 759, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 762, expect fall back to 
non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 485, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 765, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 768, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 492, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 771, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 774, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 501, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 780, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 783, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 508, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 786, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 789, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 515, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 792, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 795, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 522, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 798, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 801, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 531, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 807, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 810, expect fall back to 
non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 538, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 813, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 816, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 545, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 819, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 822, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 552, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 825, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 828, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 559, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 831, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 834, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 566, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 837, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 840, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 575, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 846, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 849, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 582, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 852, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 855, expect fall back to 
non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 589, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 591, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 593, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 597, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 599, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 603, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 605, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 613, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 624, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 636, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 644, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 858, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 861, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 864, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 867, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 870, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 873, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 876, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 879, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 882, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 885, expect fall back to 
non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 888, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 891, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 894, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 897, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 900, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 903, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 906, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 909, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 912, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 915, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 918, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 921, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 924, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 927, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Missing scale and zero-point for tensor 728, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [2022-11-08 06:01:58 WARNING] Detected invalid timing cache, setup a local cache instead [2022-11-08 06:02:05 WARNING] No implementation of layer Split_120[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:05 WARNING] No implementation of layer DCNv2_123[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:05 WARNING] No implementation of layer Upsample_124[Plugin.Upsample] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. 
[2022-11-08 06:02:05 WARNING] No implementation of layer Split_126[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:05 WARNING] No implementation of layer DCNv2_129[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:05 WARNING] No implementation of layer Upsample_131[Plugin.Upsample] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:05 WARNING] No implementation of layer Split_133[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:06 WARNING] No implementation of layer DCNv2_136[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:06 WARNING] No implementation of layer Split_139[SplitPlugin] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:06 WARNING] No implementation of layer DCNv2_142[DCNv2] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_151[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_156[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_165[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_170[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_179[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_184[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. 
requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_189[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_194[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation of layer Conv_199[Conv] obeys the requested constraints in strict mode. No conforming implementation was found i.e. requested layer computation precision and output precision types are ignored, using the fastest implementation. [2022-11-08 06:02:07 WARNING] No implementation obeys reformatting-free rules, at least 1 reformatting nodes are needed, now picking the fastest path instead. [2022-11-08 06:02:07 ERROR] 2: [pluginV2Runner.cpp::getInputHostScale::88] Error Code 2: Internal Error (Assertion scales.size() == 1 failed.) terminate called after throwing an instance of 'std::runtime_error' what(): Failed to create object
Please provide the repro steps so that we can investigate this.
Closing since there has been no activity for more than 3 weeks; please reopen if you still have questions. Thanks!
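For context on the reproduction side, the getBatch1[...] / ReadBinFile lines in the log come from a user-supplied PTQ calibrator that feeds raw .bin tensors to TensorRT. A minimal skeleton of that kind of calibrator is sketched below; the class name, file handling and cache policy are illustrative guesses, not the reporter's actual code.

```cpp
#include "NvInfer.h"
#include <cuda_runtime_api.h>
#include <fstream>
#include <string>
#include <vector>

// Minimal file-based PTQ calibrator skeleton (illustrative only). Each .bin
// file holds one preprocessed input tensor of exactly `bytesPerBatch` bytes,
// matching the "Read size 2334720 vs Need size 2334720" pattern in the log.
class BinFileCalibrator : public nvinfer1::IInt8EntropyCalibrator2
{
public:
    BinFileCalibrator(std::vector<std::string> files, size_t bytesPerBatch)
        : mFiles(std::move(files)), mBytes(bytesPerBatch)
    {
        cudaMalloc(&mDeviceInput, mBytes);
    }
    ~BinFileCalibrator() override { cudaFree(mDeviceInput); }

    int32_t getBatchSize() const noexcept override { return 1; }

    bool getBatch(void* bindings[], char const* names[], int32_t nbBindings) noexcept override
    {
        if (mIndex >= mFiles.size())
            return false; // calibration data exhausted
        std::ifstream in(mFiles[mIndex++], std::ios::binary);
        std::vector<char> host(mBytes);
        if (!in.read(host.data(), static_cast<std::streamsize>(mBytes)))
            return false; // short read: file does not match the input size
        cudaMemcpy(mDeviceInput, host.data(), mBytes, cudaMemcpyHostToDevice);
        bindings[0] = mDeviceInput; // single input binding, e.g. "input.1"
        return true;
    }

    void const* readCalibrationCache(size_t& length) noexcept override
    {
        length = 0;
        return nullptr; // always recalibrate rather than reuse a stale cache
    }

    void writeCalibrationCache(void const* cache, size_t length) noexcept override
    {
        std::ofstream("calib.cache", std::ios::binary)
            .write(static_cast<char const*>(cache), static_cast<std::streamsize>(length));
    }

private:
    std::vector<std::string> mFiles;
    size_t mBytes{0};
    void* mDeviceInput{nullptr};
    size_t mIndex{0};
};
```

One thing worth checking in such a setup is the calibration cache: reusing a cache produced for a different version of the network can leave tensors without scales, which shows up as exactly the "Missing scale and zero-point" warnings seen in the log.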
Description
I used TensorRT 8.3 to run post-training quantization (PTQ) on my model, but encountered an error. The detailed log is the one posted in full above; I don't know what this error means.
Environment
Load onnx2trt lib V0.9.0 built@Nov 1 2022 07:13:17 DebugLevel=1
GPU Quadro P2200 @ 1.493 GHz, Compute Capability 6.1
CUDA MemInfo total= 5301469184Bytes, free= 4391763968Bytes, Delta= 0Bytes
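A side note on this environment: the Quadro P2200 is compute capability 6.1, which supports fast INT8 (DP4A) but has no fast native FP16, matching the "Half2 support requested on hardware without native FP16 support" warning in the log. One way to avoid requesting unsupported precisions is to guard the builder flags with capability checks; a small sketch, assuming the usual builder and config objects from the (unshown) build code:

```cpp
#include "NvInfer.h"

// Request FP16/INT8 only where the platform actually accelerates them.
// `builder` and `config` are assumed to come from the surrounding build code.
void setSupportedPrecisions(nvinfer1::IBuilder& builder,
                            nvinfer1::IBuilderConfig& config)
{
    if (builder.platformHasFastFp16())
        config.setFlag(nvinfer1::BuilderFlag::kFP16);
    if (builder.platformHasFastInt8())
        config.setFlag(nvinfer1::BuilderFlag::kINT8);
}
```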