Closed NingNanXin closed 1 year ago
@NingNanXin I have the same problem. Did you solve this?
Debug your preprocess and post-process. torch.onnx.export() itself can give you the correct onnx model.
@NingNanXin Thanks for your reply. Can you explain in more detail? How did you solve it?
@ducnguyen998
Use numpy.testing.assert_allclose to compare the torch result and the onnx result.
For me, step 3 was right, so the next thing to do is compare your processing code with the original code.

@NingNanXin Thank you. I have my own model that produces accurate results when run in PyTorch. However, when I convert it to ONNX, the model gives incorrect results. I converted the author's pre-trained model to ONNX and it successfully produced the same results as the PyTorch model, but when I applied the same process to my own model, it did not work as expected.
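The comparison suggested above with `numpy.testing.assert_allclose` can be sketched as follows. The two arrays here are placeholders; in practice they would be the PyTorch model's output and the ONNX Runtime session's output for the same input image. The tolerances are typical values for fp32 export checks, not anything prescribed by this thread.

```python
import numpy as np

# Placeholders: in practice, run the same input through the PyTorch model
# and the ONNX session, then compare the two output arrays element-wise.
torch_out = np.array([0.12, 0.87, 0.45], dtype=np.float32)
onnx_out = np.array([0.12, 0.87, 0.45], dtype=np.float32)

# Raises an AssertionError with a detailed mismatch report if they differ.
np.testing.assert_allclose(torch_out, onnx_out, rtol=1e-3, atol=1e-5)
print("outputs match")
```

If this check passes, the exported graph is numerically equivalent to the PyTorch model, and the discrepancy must come from the preprocess or post-process code.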
Hi, thanks for your work. Recently I retrained U2net for my task and exported it to onnx. The export code is as follows:
After successfully exporting the onnx model, I tried to infer an image with it. The preprocess code is as follows, but the result is wrong.
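The preprocess snippet referenced above is not included in the thread, but a typical U2net-style preprocess looks roughly like the sketch below. The scaling by the image maximum, the ImageNet mean/std values, and the 320x320 size are assumptions taken from the common U2net pipeline; check them against your own training code.

```python
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    """Hypothetical preprocess in the usual U2net style:
    scale to [0, 1], normalize with ImageNet mean/std, HWC -> NCHW."""
    img = image.astype(np.float32) / np.max(image)
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    img = (img - mean) / std
    # HWC -> 1xCxHxW, the layout the exported onnx model expects
    return img.transpose(2, 0, 1)[None, ...]

x = preprocess(np.random.randint(0, 256, (320, 320, 3), dtype=np.uint8))
print(x.shape)  # (1, 3, 320, 320)
```

A mismatch in any of these steps (scaling, normalization constants, or channel order) between training and onnx inference is a common cause of wrong results.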
Thanks for your help!
Found it: in the save_result function, I had commented out the multiplication by 255. Speechless, I am so, emmm.
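For anyone hitting the same thing: the fix above is about the post-process step, not the export. A sketch of what save_result's scaling needs to do is below; the min-max normalization is the usual U2net post-process, and the function name is illustrative, not the repo's exact code.

```python
import numpy as np

def to_mask(pred: np.ndarray) -> np.ndarray:
    """Min-max normalize the prediction to [0, 1], then scale to [0, 255]
    before saving. Dropping the "* 255" leaves values near zero, so the
    saved image looks black/wrong even though the model output is fine."""
    pred = (pred - pred.min()) / (pred.max() - pred.min())
    return (pred * 255).astype(np.uint8)

mask = to_mask(np.array([[0.1, 0.9], [0.5, 0.3]], dtype=np.float32))
print(mask.min(), mask.max())  # 0 255
```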