Tencent / ncnn

ncnn is a high-performance neural network inference framework optimized for the mobile platform

The output between onnx and ncnn is different. #2596

alswlsghd320 opened this issue 3 years ago (status: Open)

alswlsghd320 commented 3 years ago

I converted a torch model to an ncnn model (.pt => onnx => onnxsim => .bin, .param) and the conversion succeeded.

Now I'm comparing the output ncnn::Mat against the onnx and torch outputs, using my custom ENet model.

I checked that:

  1. Both the .onnx and .param models have the same weights (as does the .pt).
  2. The outputs of onnx and torch are the same.

But the ncnn model gives a totally different result compared to onnx or torch.

So I checked the second convolution layer's output, and its values start to differ at the third or fourth decimal place.
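(For what it's worth, a divergence that only starts at the third or fourth decimal place is often within normal float32 accumulation differences between backends, so an elementwise tolerance check is more meaningful than exact equality. A minimal numpy sketch, with random arrays standing in for the actual layer outputs:)

```python
import numpy as np

# Hypothetical stand-ins for the same conv layer's output from onnx and ncnn.
rng = np.random.default_rng(0)
onnx_out = rng.standard_normal((1, 16, 32, 32)).astype(np.float32)
ncnn_out = onnx_out + rng.normal(0, 1e-4, onnx_out.shape).astype(np.float32)

# Largest elementwise discrepancy between the two backends.
max_abs_diff = np.abs(onnx_out - ncnn_out).max()
print(f"max abs diff: {max_abs_diff:.6f}")

# Differences around 1e-3 or smaller are usually float32 noise; a "totally
# different" final result points at preprocessing or conversion instead.
print("close:", np.allclose(onnx_out, ncnn_out, rtol=0, atol=1e-3))
```

If the per-layer difference stays at this magnitude but the final output is completely wrong, the problem is more likely in the input pipeline than in the converted weights.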

How can I solve this problem? I'll send my custom onnx, onnxsim, and ncnn model files; please give me your address.

Thanks for the help.

nihui commented 3 years ago

Hi

please check the FAQ list https://github.com/Tencent/ncnn/wiki/FAQ-ncnn-produce-wrong-result

DeepvisionsKorea commented 3 years ago

Thanks for the reply.

I already checked the FAQ list above. I trained my model with torch, and I confirmed the input is RGB using `ncnn::Mat in = ncnn::Mat::from_pixels(m.data, ncnn::Mat::PIXEL_BGR2RGB, m.cols, m.rows);`.

I also tested it in BGR order on purpose, and tried .jpg, .bmp, and .png image files. But the result is still different from the original torch model's output.
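(Besides channel order, the other common source of mismatch is mean/std normalization: `from_pixels` yields raw 0–255 values, while torch pipelines usually scale to [0,1] and then normalize. The mean/norm values passed to ncnn's `substract_mean_normalize` have to be rescaled accordingly. A numpy sketch of the equivalence to check; the ImageNet stats below are an assumption, substitute the values from your training transform:)

```python
import numpy as np

# Hypothetical BGR uint8 image as OpenCV would load it (H x W x 3).
bgr = np.arange(2 * 2 * 3, dtype=np.uint8).reshape(2, 2, 3)

# The ImageNet stats here are an assumption; use your own training values.
mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
std = np.array([0.229, 0.224, 0.225], dtype=np.float32)

# What torch typically sees: BGR -> RGB, scale to [0,1], Normalize(mean, std).
rgb = bgr[..., ::-1].astype(np.float32)
torch_style = (rgb / 255.0 - mean) / std

# What ncnn must be fed to match, since from_pixels keeps raw 0-255 values:
# mean_vals = 255 * mean, norm_vals = 1 / (255 * std).
mean_vals = 255.0 * mean
norm_vals = 1.0 / (255.0 * std)
ncnn_style = (rgb - mean_vals) * norm_vals

# (x/255 - mean)/std == (x - 255*mean) * (1/(255*std)), so these agree.
assert np.allclose(torch_style, ncnn_style, atol=1e-4)
```

If the ncnn side is given `mean_vals`/`norm_vals` computed for [0,1] inputs (or none at all), every pixel is off by orders of magnitude, which would explain a completely different final output.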

Is there any other way to solve this?

Thanks for your help.

nihui commented 3 months ago

For the various problems with onnx model conversion, it is recommended to use the latest pnnx tool to convert your model to ncnn:

```shell
pip install pnnx
pnnx model.onnx inputshape=[1,3,224,224]
```

Detailed reference documentation: https://github.com/pnnx/pnnx and https://github.com/Tencent/ncnn/wiki/use-ncnn-with-pytorch-or-onnx#how-to-use-pnnx