Open BaofengZan opened 6 months ago
Hmmm, I hadn't tested this, so it may affect some normalization, but I'm not sure. If it works without .eval(), then I'd recommend just leaving it as is?
When I export to ONNX using torch.onnx.export, the BN layers are exported in eval mode by default, and if I run inference with ONNX Runtime the result is incorrect. But when I export the BN layers in TRAINING mode, converting to TensorRT fails with a message that the BN layer is in TRAINING mode, which makes the conversion impossible. So I think this issue still needs to be checked.
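For reference, a minimal sketch of the two export paths I am comparing (the model and input shape are placeholders, not the actual network from this repo):

```python
import torch
import torch.nn as nn

# Tiny stand-in model with a BN layer (placeholder for the real network).
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
dummy_input = torch.randn(1, 3, 64, 64)

# Default path: export with BN in inference (eval) mode.
model.eval()
torch.onnx.export(model, dummy_input, "model_eval.onnx", opset_version=13)

# Alternative path: keep BN in training mode in the exported graph;
# TensorRT then rejects the BatchNormalization nodes (training_mode=1).
torch.onnx.export(
    model,
    dummy_input,
    "model_training.onnx",
    opset_version=13,
    training=torch.onnx.TrainingMode.TRAINING,
    do_constant_folding=False,
)
```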
Error when converting to TensorRT. Official answer: https://github.com/NVIDIA/TensorRT/issues/3457#issuecomment-1817477998
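In case it helps, one workaround sometimes used (I have not verified this is what the linked comment recommends) is to patch the exported graph so every BatchNormalization node is back in inference mode before handing it to TensorRT. A rough sketch with the onnx Python package; the file names are placeholders:

```python
import onnx

model = onnx.load("model_training.onnx")  # placeholder file name

for node in model.graph.node:
    if node.op_type == "BatchNormalization":
        # Force inference behaviour: training_mode = 0 ...
        for attr in node.attribute:
            if attr.name == "training_mode":
                attr.i = 0
        # ... and drop the extra training-mode outputs, keeping only Y
        # (assumes nothing else in the graph consumes those outputs).
        del node.output[1:]

onnx.checker.check_model(model)
onnx.save(model, "model_patched.onnx")
```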
yeah sorry, that is indeed something we missed...
Thank you, I look forward to a fix for this problem.
I used ‘Airline_demo.py’ to test the images, and the original file gives normal results. But when I add Premodel.eval(), the result is 0 (all other parameters stay the same). See the diagnostic sketch after the screenshots below.
without model.eval()
with model.eval()
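For what it's worth, a quick way to check whether the BatchNorm running statistics are the culprit is to inspect them directly. The sketch below assumes `Premodel` is a torch.nn.Module already constructed/loaded as in Airline_demo.py:

```python
import torch.nn as nn

# Assumes `Premodel` is already built and has its weights loaded.
for name, module in Premodel.named_modules():
    if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
        if module.running_mean is None:
            print(name, "has no running statistics (track_running_stats=False)")
        else:
            # In eval mode BN normalizes with these buffers instead of the
            # batch statistics; degenerate values here (e.g. huge variance)
            # would explain the near-zero output after Premodel.eval().
            print(name,
                  "running_mean:", module.running_mean.abs().mean().item(),
                  "running_var:", module.running_var.mean().item())
```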