PaddlePaddle / Paddle-Lite

PaddlePaddle high-performance deep learning inference engine for mobile and edge devices
https://www.paddlepaddle.org.cn/lite
Apache License 2.0

MobileNetV3 ONNX model converted to .nb (FP16): inference output is NaN #10402

Open fengyanWang opened 11 months ago

fengyanWang commented 11 months ago

test_file.zip

fengyanWang commented 11 months ago

Is no one handling this issue? It has already been 5 days.

hong19860320 commented 6 months ago

You can first convert the model to FP32 and check whether its accuracy is OK. If FP32 is fine but FP16 outputs NaN, then either an FP16 operator has a bug or the FP16 computation is overflowing.
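
One way to check this is to export both an FP32 and an FP16 .nb with opt and feed the same input to both. Below is a minimal sketch assuming the paddlelite Python wheel's light API; the model file names and the input shape are placeholders, and API details may differ across Paddle-Lite versions.

```python
# Minimal sketch: run the same input through the FP32 and FP16 .nb models
# and compare the outputs. File names below are placeholders.
import numpy as np
from paddlelite.lite import MobileConfig, create_paddle_predictor

def run_nb(model_path, x):
    config = MobileConfig()
    config.set_model_from_file(model_path)       # load the .nb produced by opt
    predictor = create_paddle_predictor(config)
    predictor.get_input(0).from_numpy(x)
    predictor.run()
    return predictor.get_output(0).numpy()

# MobileNetV3-style input; adjust the shape and preprocessing to match the real model.
x = np.random.rand(1, 3, 224, 224).astype("float32")

out_fp32 = run_nb("mobilenetv3_fp32.nb", x)      # placeholder path
out_fp16 = run_nb("mobilenetv3_fp16.nb", x)      # placeholder path

print("fp32 output max abs value:", np.abs(out_fp32).max())
print("fp16 output contains NaN:", np.isnan(out_fp16).any())
print("max abs diff:", np.abs(out_fp32 - out_fp16).max())
# FP16 overflows above roughly 65504, so FP32 values near or beyond that range
# suggest overflow rather than an operator bug.
```

If the FP32 run is accurate and its values stay well inside the FP16 range, that points toward an FP16 operator bug; if intermediate or output values are very large, overflow is the more likely cause.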