Tencent / ncnn

ncnn is a high-performance neural network inference framework optimized for the mobile platform

Hi3559A inference results are wrong #1778

Open maxuehao opened 4 years ago

maxuehao commented 4 years ago

Hi everyone. With the same C++ code and model, the inference results are correct when verified on PC, but after porting to the HiSilicon platform the results overflow. Has anyone run into this? I cross-compiled exactly as described in the documentation. The model is MobileNetV2, converted from both onnx and caffe; models converted either way give correct results on PC, and only the board-side results are wrong.
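For context, a minimal sketch of what such an inference path usually looks like with the ncnn C++ API. This is not the poster's actual code; the file names, blob names ("data"/"prob") and mean/norm values are placeholders that depend on how the model was exported.

#include "net.h"
#include <opencv2/opencv.hpp>
#include <cstdio>

int main()
{
    ncnn::Net net;
    // net.opt controls things like fp16 storage/arithmetic, which ncnn enables
    // by default on ARM but not on x86, so PC and board runs are not identical.
    if (net.load_param("mobilenet_v2.param") || net.load_model("mobilenet_v2.bin"))
        return -1;

    cv::Mat bgr = cv::imread("test.jpg");
    ncnn::Mat in = ncnn::Mat::from_pixels_resize(bgr.data, ncnn::Mat::PIXEL_BGR,
                                                 bgr.cols, bgr.rows, 224, 224);

    // Placeholder normalization; must match whatever the original training used.
    const float mean_vals[3] = {103.94f, 116.78f, 123.68f};
    const float norm_vals[3] = {1.f / 58.82f, 1.f / 58.82f, 1.f / 58.82f};
    in.substract_mean_normalize(mean_vals, norm_vals);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);

    ncnn::Mat out;
    ex.extract("prob", out);

    for (int i = 0; i < out.w; i++)
        printf("%d: %f\n", i, out[i]);
    return 0;
}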

maxuehao commented 4 years ago

Are there any special requirements for porting mobilenetv2? resnet does not have this problem.

xiakj commented 4 years ago

I'm also planning to port to the 3559a. Are there many pitfalls in this area?

maxuehao commented 4 years ago

@xiakj Porting to the board has no pitfalls; it's quite easy.

xiakj commented 4 years ago

@maxuehao Why don't you use HiSilicon's NNIE?

maxuehao commented 4 years ago

@xiakj Our NNIE is already running 5 detection, 2 recognition and 1 tracking models; there are no resources left.

nihui commented 3 months ago

In view of the various problems with onnx model conversion, it is recommended to use the latest pnnx tool to convert your model to ncnn:

pip install pnnx
pnnx model.onnx inputshape=[1,3,224,224]

Detailed reference documentation: https://github.com/pnnx/pnnx and https://github.com/Tencent/ncnn/wiki/use-ncnn-with-pytorch-or-onnx#how-to-use-pnnx
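As a rough sketch of the next step, the ncnn files that pnnx emits (by default model.ncnn.param / model.ncnn.bin) can be loaded with the usual ncnn API. The blob names "in0"/"out0" below are assumptions about pnnx's default naming and should be checked against the generated .param file.

#include "net.h"

int main()
{
    ncnn::Net net;
    if (net.load_param("model.ncnn.param") || net.load_model("model.ncnn.bin"))
        return -1;

    // Dummy input whose shape must match the inputshape given to pnnx,
    // here [1,3,224,224] as in the command above.
    ncnn::Mat in(224, 224, 3);
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("in0", in);

    ncnn::Mat out;
    ex.extract("out0", out);
    return 0;
}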