PaddlePaddle / Paddle-Lite

PaddlePaddle High-Performance Deep Learning Inference Engine for Mobile and Edge
https://www.paddlepaddle.org.cn/lite
Apache License 2.0

mobilenetv1_ssd 800x800 int8 quantized model on the Arm CPU backend core-dumps on RK3568 but runs correctly on RK3399Pro, Jetson, and similar devices #9873

Closed: markluofd closed this issue 9 months ago

markluofd commented 1 year ago

Paddle-Lite version: 2.12, built from commit b150c13.

Devices:
- Core dump: RK3568 (4x Cortex-A55, Armv8.2)
- Runs correctly: RK3399Pro, Jetson, and other devices, all with Armv8.0 CPUs

Backend: Arm CPU, running an int8 model converted with the Paddle-Lite opt tool.

Prediction setup:
1) Prediction uses the C++ API
2) Armv8, 4 threads
3) The prediction library was built manually with the following command:

```shell
LITE_BUILD_THREADS=52 ./lite/tools/build_linux.sh --arch=armv7 --with_python=OFF --with_extra=ON --with_log=ON full_publish
```

Problem description: running the mobilenetv1_ssd 800x800 int8 model on the RK3568 produces a core dump ([crash backtrace screenshot attached]). The backtrace shows the crash in `packb_sdot_int8_n12_n8_n4`. That source file contains a large amount of inline assembly, so I suspect the assembly is not suitable for Armv8.2 and only works on Armv8.0.

Model: here is the model after opt conversion, together with the corresponding preprocessing information, label_list, and a test image: model.zip

paddle-bot[bot] commented 1 year ago

Hi! We have received your issue and will arrange for a technician to answer it as soon as possible; please be patient. Please double-check that you have provided a clear problem description, reproduction code, environment and version information, and the error message. You can also look for an answer in the official API documentation, the FAQ, and the issue history. Have a nice day!