linghu8812 / tensorrt_inference


I converted nano.trt successfully, but inference doesn't give the right result #62

Open chgit0214 opened 3 years ago

chgit0214 commented 3 years ago

Please tell me how to solve this problem. Where should I check?

linghu8812 commented 3 years ago

Is the ONNX model structure the same as in https://github.com/RangiLyu/nanodet/issues/65#issue-759529500?

chgit0214 commented 3 years ago

yes

chgit0214 commented 3 years ago

My TensorRT version is 7.0.0.11; when exporting the ONNX model I changed the opset to 10.

chgit0214 commented 3 years ago

Converting with FP32 succeeds and gives the right result, but converting with FP16 does not.
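A common cause of FP32-works-but-FP16-fails is numeric range: FP16 has a maximum finite value of 65504, so any intermediate activation larger than that overflows to infinity, and very small values underflow to zero. A quick NumPy sketch of the effect (this only illustrates the precision issue, it is not TensorRT code):

```python
import numpy as np

# Values that are fine in FP32 but break in FP16.
x = np.array([1e5, 3.14159265, 1e-8], dtype=np.float32)
y = x.astype(np.float16)

print(np.isinf(y[0]))  # True: 1e5 exceeds FP16's max of 65504, overflows to inf
print(float(y[1]))     # 3.140625: only ~3 decimal digits of precision survive
print(float(y[2]))     # 0.0: 1e-8 underflows below FP16's smallest subnormal
```

If overflow is the culprit, keeping the offending layers in FP32 (mixed precision) or normalizing inputs so activations stay in range usually fixes FP16 inference.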

zhangyilalala commented 3 years ago

@chgit0214 I'm having the same problem. Have you solved it?

tomjeans commented 2 years ago

I trained on my own custom data but can't get the right inference result. How can I solve this?