PaddlePaddle / Paddle-Lite

PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge (飞桨高性能深度学习端侧推理引擎)
https://www.paddlepaddle.org.cn/lite
Apache License 2.0

Self-built whl with verisilicon_timvx: running mobilenetv1_full_api.py after installation fails #9391

Closed Aostas closed 5 months ago

Aostas commented 1 year ago

Installing without root (and without the `--user` flag) produces this error:

```
ERROR: Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/bin/paddle_lite_opt'
Consider using the `--user` option or check the permissions.
```

Running the demo, model optimization succeeds but loading the device HAL library fails:

```
Model is optimized and saved into opt_nnadapter.nb successfully
[F 8/31 1:29:24.699 ...nadapter/nnadapter/src/runtime/device.cc:518 Find] Failed to load the nnadapter device HAL library for 'verisilicon_timvx' from libverisilicon_timvx.so, libtim-vx.so: cannot open shared object file: No such file or directory
```

```
Traceback (most recent call last):
  File "mobilenetv1_full_api.py", line 242, in <module>
    RunModel(args)
  File "mobilenetv1_full_api.py", line 218, in RunModel
    predictor.run()
RuntimeError: NNAdapter C++ Exception:
[F 8/31 1:29:24.699 ...nadapter/nnadapter/src/runtime/device.cc:518 Find] Failed to load the nnadapter device HAL library for 'verisilicon_timvx' from libverisilicon_timvx.so, libtim-vx.so: cannot open shared object file: No such file or directory
```
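The failure can be reproduced outside Paddle-Lite by asking the dynamic loader to resolve the two libraries named in the log (a minimal sketch; `can_dlopen` is a helper written for this illustration, not a Paddle-Lite API):

```python
import ctypes

def can_dlopen(name: str) -> bool:
    """Return True if the dynamic loader (dlopen) can resolve the library."""
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False

# The two libraries named in the NNAdapter log above:
for lib in ("libverisilicon_timvx.so", "libtim-vx.so"):
    print(lib, "loadable" if can_dlopen(lib) else "NOT found by the loader")
```

If both print `NOT found by the loader`, the runtime error above is expected regardless of how the predictor is configured.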

Adding the following line makes it work:
```shell
export LD_LIBRARY_PATH=/usr/local/lib/python3.7/dist-packages/paddlelite/libs
```

When a package needs shared libraries from its own `libs` folder like this, isn't that normally taken care of when the wheel is packaged? Or does the user really have to locate the install path and export it manually?
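Until the wheel wires this up itself (for example via an RPATH set at packaging time), the path can at least be derived instead of hard-coded to `python3.7`. A sketch, assuming only that the installed package ships a `libs/` directory next to its `__init__.py`; `package_libs_dir` is a helper invented for this illustration:

```python
import importlib.util
import os

def package_libs_dir(pkg: str) -> str:
    """Resolve <site-packages>/<pkg>/libs for an installed package."""
    spec = importlib.util.find_spec(pkg)
    if spec is None or spec.origin is None:
        raise ModuleNotFoundError(f"package {pkg!r} is not installed")
    return os.path.join(os.path.dirname(spec.origin), "libs")

# Hypothetical usage: print the path so it can be exported in the shell, e.g.
#   export LD_LIBRARY_PATH="$(python3 find_libs.py)"
if __name__ == "__main__":
    print(package_libs_dir("email"))  # "email" stands in for the real package
```

Note that glibc reads `LD_LIBRARY_PATH` once at process start, so setting `os.environ` from inside an already-running Python process does not affect later `dlopen` calls; printing the path and exporting it before launching Python is the reliable route.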

paddle-bot[bot] commented 1 year ago


Hi! We've received your issue; please be patient while we arrange for technicians to answer your questions as soon as possible. Please make sure that you have posted enough information to describe your request, including a clear problem description, reproduction code, environment & version, and error messages. You may also check the API docs, FAQ, and GitHub issue history for an answer. Have a nice day!

hong19860320 commented 1 year ago

PaddleLite + TIM-VX does not support the RK3399pro, because Rockchip has not provided the corresponding driver and VSI SDK. Currently supported chips are the Rockchip RK1808, 1126, and 1109, the Amlogic C308X, S905D3, and A311D, and the NXP imx8plus. If you have one of these chips, you can try the one-click demo from our docs directly: https://www.paddlepaddle.org.cn/lite/develop/demo_guides/verisilicon_timvx.html

Aostas commented 1 year ago

> PaddleLite + TIM-VX does not support the RK3399pro, because Rockchip has not provided the corresponding driver and VSI SDK. Currently supported chips are the Rockchip RK1808, 1126, and 1109, the Amlogic C308X, S905D3, and A311D, and the NXP imx8plus. If you have one of these chips, you can try the one-click demo from our docs directly: https://www.paddlepaddle.org.cn/lite/develop/demo_guides/verisilicon_timvx.html

I only borrowed the 3399 for compilation; the 3399 is the build host. The chip I am actually using is the 1808.

hong19860320 commented 1 year ago

How are you using it, then? I recall the Rockchip folks saying that supporting the RK3399pro's NPU requires the corresponding driver, because the RK3399pro connects its rk3399 and rk1808 chips over a USB bus. How are you accessing the 1808 chip? We would also like to put the RK3399pro's NPU to use.

Aostas commented 1 year ago

> How are you using it, then? I recall the Rockchip folks saying that supporting the RK3399pro's NPU requires the corresponding driver, because the RK3399pro connects its rk3399 and rk1808 chips over a USB bus. How are you accessing the 1808 chip? We would also like to put the RK3399pro's NPU to use.

I am not using the rk3399's NPU. I only picked an aarch64 platform to run the build on (it also built on an NX just now) and produce an aarch64 wheel, because building on the rk1808 itself is too slow and runs out of memory. In the end it is not the rk3399 that accesses the 1808 either: once the build finishes on the rk3399, I transfer the whl file to the rk1808 via a PC.