PaddlePaddle / FastDeploy

⚡️ An easy-to-use and fast deep learning model deployment toolkit for ☁️ Cloud, 📱 Mobile and 📹 Edge, covering 20+ mainstream scenarios across image, video, text and audio, and 150+ SOTA models with end-to-end optimization, multi-platform and multi-framework support.
https://www.paddlepaddle.org.cn/fastdeploy
Apache License 2.0

Jetson Nano build error #1291

Closed monkeycc closed 2 weeks ago

monkeycc commented 1 year ago

Environment

python3 is set as the default python:

sudo rm /usr/bin/python
sudo ln -s /usr/bin/python3.6 /usr/bin/python

There is a problem with PADDLEINFERENCE_DIRECTORY when building.

The Paddle Inference precompiled library has a matching Python version for Jetpack 4.6 (nv_jetson-cuda10.2-trt8.0-nano), which I installed directly.

But a PADDLEINFERENCE_DIRECTORY still has to be set here. I downloaded the corresponding Jetpack C++ package, C++ Jetson(Nano) Jetpack 4.6.1 paddle_inference_install_dir, and used it as this directory; I am not sure whether that is correct.

So does that mean I do not need to install the Python precompiled library at all? Or, since I already installed the Python precompiled library, is this directory unnecessary? Or should I set this directory and skip the Python precompiled library? Or, if the Python precompiled library is installed, can PADDLEINFERENCE_DIRECTORY be left unset?


git clone https://github.com/PaddlePaddle/FastDeploy.git
cd FastDeploy/python
export BUILD_ON_JETSON=ON
export ENABLE_VISION=ON

# ENABLE_PADDLE_BACKEND & PADDLEINFERENCE_DIRECTORY are optional
export ENABLE_PADDLE_BACKEND=ON
export PADDLEINFERENCE_DIRECTORY=/Download/paddle_inference_install_dir

python setup.py build

Build error. JetPack 4.6.1, Python 3.6.9, cmake 3.10.2, gcc 7.5.0


~/FastDeploy/python$ python setup.py build
fatal: not a git repository (or any of the parent directories): .git
running build
running build_py
running create_version
running cmake_build
-- The C compiler identification is GNU 7.5.0
-- The CXX compiler identification is GNU 7.5.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
Downloading file from https://bj.bcebos.com/fastdeploy/third_libs/patchelf-0.15.0-aarch64.tar.gz to /home/MM/FastDeploy/python/.setuptools-cmake-build/patchelf-0.15.0-aarch64.tar.gz ...
-- [download 5% complete]
-- [download 10% complete]
-- [download 15% complete]
-- [download 20% complete]
-- [download 26% complete]
-- [download 31% complete]
-- [download 36% complete]
-- [download 41% complete]
-- [download 46% complete]
-- [download 51% complete]
-- [download 56% complete]
-- [download 61% complete]
-- [download 67% complete]
-- [download 72% complete]
-- [download 77% complete]
-- [download 82% complete]
-- [download 83% complete]
-- [download 88% complete]
-- [download 93% complete]
-- [download 98% complete]
-- [download 100% complete]
Decompress file /home/MM/FastDeploy/python/.setuptools-cmake-build/patchelf-0.15.0-aarch64.tar.gz ...
-- Use the default onnxruntime lib. The ONNXRuntime path: /home/MM/FastDeploy/python/.setuptools-cmake-build/third_libs/install/onnxruntime
Cannot compile with onnxruntime-gpu while in linux-aarch64 platform, fallback to onnxruntime-cpu
CMake Error at cmake/paddle_inference.cmake:72 (find_package):
  By not providing "FindPython.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "Python", but
  CMake did not find one.

  Could not find a package configuration file provided by "Python" with any
  of the following names:

    PythonConfig.cmake
    python-config.cmake

  Add the installation prefix of "Python" to CMAKE_PREFIX_PATH or set
  "Python_DIR" to a directory containing one of the above files.  If "Python"
  provides a separate development package or SDK, be sure it has been
  installed.
Call Stack (most recent call first):
  CMakeLists.txt:225 (include)

-- Configuring incomplete, errors occurred!
See also "/home/MM/FastDeploy/python/.setuptools-cmake-build/CMakeFiles/CMakeOutput.log".
Traceback (most recent call last):
  File "setup.py", line 437, in <module>
    license='Apache 2.0')
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 129, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/lib/python3.6/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 280, in run
    self.run_command('cmake_build')
  File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 266, in run
    subprocess.check_call(cmake_args)
  File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/cmake', '-DPYTHON_INCLUDE_DIR=/usr/include/python3.6m', '-DPYTHON_EXECUTABLE=/usr/bin/python', '-DBUILD_FASTDEPLOY_PYTHON=ON', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=paddle2onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DCMAKE_BUILD_TYPE=Release', '-DENABLE_RKNPU2_BACKEND=OFF', '-DENABLE_SOPHGO_BACKEND=OFF', '-DWITH_ASCEND=OFF', '-DENABLE_ORT_BACKEND=OFF', '-DENABLE_OPENVINO_BACKEND=OFF', '-DENABLE_PADDLE_BACKEND=ON', '-DENABLE_POROS_BACKEND=OFF', '-DENABLE_TRT_BACKEND=OFF', '-DENABLE_LITE_BACKEND=OFF', '-DPADDLELITE_URL=OFF', '-DENABLE_VISION=ON', '-DENABLE_ENCRYPTION=OFF', '-DENABLE_FLYCV=OFF', '-DENABLE_CVCUDA=OFF', '-DENABLE_TEXT=OFF', '-DENABLE_BENCHMARK=OFF', '-DWITH_GPU=OFF', '-DWITH_IPU=OFF', '-DWITH_KUNLUNXIN=OFF', '-DBUILD_ON_JETSON=ON', '-DTRT_DIRECTORY=UNDEFINED', '-DCUDA_DIRECTORY=/usr/local/cuda', '-DLIBRARY_NAME=fastdeploy', '-DPY_LIBRARY_NAME=fastdeploy_main', '-DOPENCV_DIRECTORY=', '-DORT_DIRECTORY=', '-DPADDLEINFERENCE_DIRECTORY=/home/MM/FastDeploy/paddle_inference_jetson', '-DRKNN2_TARGET_SOC=', '/home/MM/FastDeploy']' returned non-zero exit status 1.
jiangjiajun commented 1 year ago

@monkeycc Try upgrading CMake and build again.

Download this file https://github.com/Kitware/CMake/releases/download/v3.25.2/cmake-3.25.2-linux-aarch64.tar.gz, extract it, then run the following commands:

export PATH=/Path/To/cmake-3.25.2/bin:${PATH}
cmake --version

After confirming the CMake version is 3.25, rebuild FastDeploy.

Note that the export above only changes the CMake version for the current terminal session. To upgrade CMake system-wide, add the export to your bashrc.
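
For reference, a minimal end-to-end sketch of the upgrade; the ~/tools install location is just an example:

# Fetch and unpack the prebuilt aarch64 CMake (example location: ~/tools)
mkdir -p ~/tools && cd ~/tools
wget https://github.com/Kitware/CMake/releases/download/v3.25.2/cmake-3.25.2-linux-aarch64.tar.gz
tar xzf cmake-3.25.2-linux-aarch64.tar.gz

# Put it first on PATH for this shell and verify
export PATH=~/tools/cmake-3.25.2-linux-aarch64/bin:${PATH}
cmake --version        # should report 3.25.2

# Optional: make the change permanent
echo 'export PATH=~/tools/cmake-3.25.2-linux-aarch64/bin:${PATH}' >> ~/.bashrc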

zachary-zheng commented 1 year ago

Jetson NX, JetPack 4.6.1, Python 3.6. After upgrading CMake, the problem above was solved, but now I get the following:

-- The CUDA compiler identification is unknown
CMake Error at /home/cji-loto01/fast_deploy/cmake-3.25.2-linux-aarch64/share/cmake-3.25/Modules/CMakeDetermineCUDACompiler.cmake:603 (message):
  Failed to detect a default CUDA architecture.

@jiangjiajun Could you please take a look?

jiangjiajun commented 1 year ago

@wang-xinyu

wang-xinyu commented 1 year ago

@monkeycc @zachary-zheng You can keep the default cmake 3.10. Do not set python3 as the default python; instead call python3 explicitly when building (python3 setup.py build), as in the sketch below.
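
A minimal sketch of that approach, assuming /usr/bin/python originally pointed at python2.7 (the usual Ubuntu 18.04 default); check with ls -l /usr/bin/python* and adjust to your system:

# Undo the earlier symlink change (only if you changed it as shown at the top of this issue)
sudo rm /usr/bin/python
sudo ln -s /usr/bin/python2.7 /usr/bin/python

# Build by calling python3 explicitly
cd ~/FastDeploy/python
export BUILD_ON_JETSON=ON
export ENABLE_VISION=ON
python3 setup.py build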

qiulongquan commented 1 year ago

@wang-xinyu A similar error happens when building with Python on a Jetson Nano; I get the same error with both python and python3. JetPack 4.6.3, Python 3.6.15, cmake 3.10.2, gcc 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04), Jetson Nano B01 4G.

(paddle3.6) qiulongquan@qiulongquan-desktop:~/ml/FastDeploy/python$ python3 setup.py build
running build
running build_py
running create_version
running cmake_build
Decompress file /home/qiulongquan/ml/FastDeploy/python/.setuptools-cmake-build/patchelf-0.15.0-aarch64.tar.gz ...
-- Use the default onnxruntime lib. The ONNXRuntime path: /home/qiulongquan/ml/FastDeploy/python/.setuptools-cmake-build/third_libs/install/onnxruntime
Cannot compile with onnxruntime-gpu while in linux-aarch64 platform, fallback to onnxruntime-cpu
CMake Error at cmake/paddle_inference.cmake:72 (find_package):
  By not providing "FindPython.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "Python", but
  CMake did not find one.

  Could not find a package configuration file provided by "Python" with any
  of the following names:

    PythonConfig.cmake
    python-config.cmake

  Add the installation prefix of "Python" to CMAKE_PREFIX_PATH or set
  "Python_DIR" to a directory containing one of the above files.  If "Python"
  provides a separate development package or SDK, be sure it has been
  installed.
Call Stack (most recent call first):
  CMakeLists.txt:225 (include)

-- Configuring incomplete, errors occurred!
See also "/home/qiulongquan/ml/FastDeploy/python/.setuptools-cmake-build/CMakeFiles/CMakeOutput.log".
Traceback (most recent call last):
  File "setup.py", line 437, in <module>
    license='Apache 2.0')
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
    return distutils.core.setup(**attrs)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 280, in run
    self.run_command('cmake_build')
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 266, in run
    subprocess.check_call(cmake_args)
  File "/home/qiulongquan/mambaforge/envs/paddle3.6/lib/python3.6/subprocess.py", line 311, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/cmake', '-DPYTHON_INCLUDE_DIR=/home/qiulongquan/mambaforge/envs/paddle3.6/include/python3.6m', '-DPYTHON_EXECUTABLE=/home/qiulongquan/mambaforge/envs/paddle3.6/bin/python3', '-DBUILD_FASTDEPLOY_PYTHON=ON', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=paddle2onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DCMAKE_BUILD_TYPE=Release', '-DENABLE_RKNPU2_BACKEND=OFF', '-DENABLE_SOPHGO_BACKEND=OFF', '-DWITH_ASCEND=OFF', '-DENABLE_ORT_BACKEND=OFF', '-DENABLE_OPENVINO_BACKEND=OFF', '-DENABLE_PADDLE_BACKEND=ON', '-DENABLE_POROS_BACKEND=OFF', '-DENABLE_TRT_BACKEND=OFF', '-DENABLE_LITE_BACKEND=OFF', '-DPADDLELITE_URL=OFF', '-DENABLE_VISION=ON', '-DENABLE_ENCRYPTION=OFF', '-DENABLE_FLYCV=OFF', '-DENABLE_CVCUDA=OFF', '-DENABLE_TEXT=OFF', '-DENABLE_BENCHMARK=OFF', '-DWITH_GPU=OFF', '-DWITH_IPU=OFF', '-DWITH_KUNLUNXIN=OFF', '-DBUILD_ON_JETSON=ON', '-DTRT_DIRECTORY=UNDEFINED', '-DCUDA_DIRECTORY=/usr/local/cuda', '-DLIBRARY_NAME=fastdeploy', '-DPY_LIBRARY_NAME=fastdeploy_main', '-DOPENCV_DIRECTORY=', '-DORT_DIRECTORY=', '-DPADDLEINFERENCE_DIRECTORY=/Download/paddle_inference_jetson', '-DRKNN2_TARGET_SOC=', '/home/qiulongquan/ml/FastDeploy']' returned non-zero exit status 1.

jiangjiajun commented 1 year ago

It looks like Python was not found. Set the environment variables below according to your own Python path and try again:

export LD_LIBRARY_PATH=/opt/_internal/cpython-3.7.0/lib/:${LD_LIBRARY_PATH}
export PATH=/opt/_internal/cpython-3.7.0/bin/:${PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/opt/_internal/cpython-3.7.0/bin/python3.7
        -DPYTHON_INCLUDE_DIR:PATH=/opt/_internal/cpython-3.7.0/include/python3.7m
        -DPYTHON_LIBRARIES:FILEPATH=/opt/_internal/cpython-3.7.0/lib/libpython3.so"
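
For reference, the /opt/_internal/cpython-3.7.0 paths above come from a manylinux build image. On a stock JetPack image with the system Python 3.6, the equivalent settings would look roughly like the sketch below (assumed Ubuntu 18.04 aarch64 paths; verify them on your device and install libpython3-dev first):

# Verify the real locations before exporting, e.g.:
#   python3 -c "import sysconfig; print(sysconfig.get_paths()['include'])"
#   ls /usr/lib/aarch64-linux-gnu/ | grep libpython3.6
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/usr/bin/python3.6
        -DPYTHON_INCLUDE_DIR:PATH=/usr/include/python3.6m
        -DPYTHON_LIBRARIES:FILEPATH=/usr/lib/aarch64-linux-gnu/libpython3.6m.so"
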
wang-xinyu commented 1 year ago

It looks like Python was not found. Set the environment variables below according to your own Python path and try again:

export LD_LIBRARY_PATH=/opt/_internal/cpython-3.7.0/lib/:${LD_LIBRARY_PATH}
export PATH=/opt/_internal/cpython-3.7.0/bin/:${PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/opt/_internal/cpython-3.7.0/bin/python3.7
        -DPYTHON_INCLUDE_DIR:PATH=/opt/_internal/cpython-3.7.0/include/python3.7m
        -DPYTHON_LIBRARIES:FILEPATH=/opt/_internal/cpython-3.7.0/lib/libpython3.so"

Alternatively, you can try sudo apt install libpython-dev libpython3-dev

qiulongquan commented 1 year ago

It looks like Python was not found. Set the environment variables below according to your own Python path and try again:

export LD_LIBRARY_PATH=/opt/_internal/cpython-3.7.0/lib/:${LD_LIBRARY_PATH}
export PATH=/opt/_internal/cpython-3.7.0/bin/:${PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/opt/_internal/cpython-3.7.0/bin/python3.7
        -DPYTHON_INCLUDE_DIR:PATH=/opt/_internal/cpython-3.7.0/include/python3.7m
        -DPYTHON_LIBRARIES:FILEPATH=/opt/_internal/cpython-3.7.0/lib/libpython3.so"

Alternatively, you can try sudo apt install libpython-dev libpython3-dev

Thanks, your suggestion was correct. The problem was that the Python path could not be found; after specifying the Python path via export, the build completes normally. Thanks!

qiulongquan commented 1 year ago

Sorry, after the build problem above was solved I ran a test. In GPU mode the results come out normally, but adding the --use_trt True argument causes an error.

JetPack 4.6.3, Python 3.6.15, cmake 3.10.2, gcc 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04), Jetson Nano B01 4G, cuDNN 8.2.1.32. Test inference command: python infer.py --model Pytorch_RetinaFace_mobile0.25-640-640.onnx --image test_lite_face_detector_3.jpg --device gpu --use_trt True

(paddle3.6) qiulongquan@qiulongquan-desktop:~/ml/FastDeploy/examples/vision/facedet/retinaface/python$ python infer.py --model Pytorch_RetinaFace_mobile0.25-640-640.onnx --image test_lite_face_detector_3.jpg --device gpu --use_trt True
[INFO] fastdeploy/backends/tensorrt/trt_backend.cc(416)::BuildTrtEngine  Start to building TensorRT Engine...
[ERROR] fastdeploy/backends/tensorrt/trt_backend.cc(228)::log  1: [executionResources.cpp::setTacticSources::156] Error Code 1: Cudnn (Could not initialize cudnn, please check cudnn installation.)
[ERROR] fastdeploy/backends/tensorrt/trt_backend.cc(228)::log  2: [builder.cpp::buildSerializedNetwork::609] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed. )
[ERROR] fastdeploy/backends/tensorrt/trt_backend.cc(465)::BuildTrtEngine  Failed to call buildSerializedNetwork().
[ERROR] fastdeploy/backends/tensorrt/trt_backend.cc(572)::CreateTrtEngineFromOnnx  Failed to build tensorrt engine.
[INFO] fastdeploy/runtime.cc(289)::Init  Runtime initialized with Backend::TRT in Device::GPU.
Segmentation fault (core dumped)


jiangjiajun commented 1 year ago

This is most likely because TensorRT does not yet support some operators in this ONNX model. You can call trtexec on the Jetson directly to try converting the ONNX model to a TRT engine file; it prints logs explaining why the conversion fails.

You can also try optimizing the model with onnxsim before loading it again, as in the sketch below.
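
For reference, a rough sketch of both checks; the trtexec path is the usual JetPack install location and the output file names are just placeholders:

# Try converting the ONNX model with TensorRT's own CLI; it logs why a conversion fails
# (trtexec normally ships with JetPack under /usr/src/tensorrt/bin)
/usr/src/tensorrt/bin/trtexec --onnx=Pytorch_RetinaFace_mobile0.25-640-640.onnx --verbose

# Simplify the model with onnx-simplifier, then load the simplified model instead
pip3 install onnx-simplifier
python3 -m onnxsim Pytorch_RetinaFace_mobile0.25-640-640.onnx RetinaFace_mobile0.25_sim.onnx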

qiulongquan commented 1 year ago

It does not look like an unsupported-operator problem: running the same infer.py with trt=true on another PC produces results. From the error message it looks like cuDNN cannot be found. I already tried onnxsim; the simplified ONNX model gives the same error.

jiangjiajun commented 1 year ago

The TensorRT versions on the PC and the Jetson are different. Your JetPack version is 4.6, so its TensorRT should be 8.0, which is a fairly old version.
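
For reference, a quick way to check what is actually installed on the Jetson (assumes the JetPack TensorRT packages and the python3 bindings are present):

dpkg -l | grep -i tensorrt                                  # JetPack's TensorRT Debian packages
python3 -c "import tensorrt; print(tensorrt.__version__)"  # version seen by the Python bindings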

qiulongquan commented 1 year ago

The TensorRT versions on the PC and the Jetson are different. Your JetPack version is 4.6, so its TensorRT should be 8.0, which is a fairly old version.

I was testing by following the official RetinaFace instructions. Can this model not run with TRT? The README describes a TRT inference mode: https://github.com/PaddlePaddle/FastDeploy/blob/develop/examples/vision/facedet/retinaface/python/README_CN.md

zachary-zheng commented 1 year ago

It looks like Python was not found. Set the environment variables below according to your own Python path and try again:

export LD_LIBRARY_PATH=/opt/_internal/cpython-3.7.0/lib/:${LD_LIBRARY_PATH}
export PATH=/opt/_internal/cpython-3.7.0/bin/:${PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/opt/_internal/cpython-3.7.0/bin/python3.7
        -DPYTHON_INCLUDE_DIR:PATH=/opt/_internal/cpython-3.7.0/include/python3.7m
        -DPYTHON_LIBRARIES:FILEPATH=/opt/_internal/cpython-3.7.0/lib/libpython3.so"

Alternatively, you can try sudo apt install libpython-dev libpython3-dev

On Jetson NX the settings above target Python 3.7, but the default install is 3.6, and there is no _internal/cpython-3.7.0... folder under /opt/. I have already installed libpython-dev libpython3-dev and still get the error:

(Same "FindPython.cmake" CMake error and traceback as qiulongquan posted above.)

Please help, thanks!

827346462 commented 1 year ago

It looks like Python was not found. Set the environment variables below according to your own Python path and try again:

export LD_LIBRARY_PATH=/opt/_internal/cpython-3.7.0/lib/:${LD_LIBRARY_PATH}
export PATH=/opt/_internal/cpython-3.7.0/bin/:${PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=/opt/_internal/cpython-3.7.0/bin/python3.7
        -DPYTHON_INCLUDE_DIR:PATH=/opt/_internal/cpython-3.7.0/include/python3.7m
        -DPYTHON_LIBRARIES:FILEPATH=/opt/_internal/cpython-3.7.0/lib/libpython3.so"

Alternatively, you can try sudo apt install libpython-dev libpython3-dev

Thanks, your suggestion was correct. The problem was that the Python path could not be found; after specifying the Python path via export, the build completes normally. Thanks!

How exactly did you specify the path? I specified it and still get an error (screenshot attached).

827346462 commented 1 year ago

LD_LIBRARY_PATH

I pointed it at the Python environment inside my conda env.
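
For reference, when the target interpreter lives in a conda/mamba environment, the same variables would point inside the environment prefix. A hedged sketch (conda sets CONDA_PREFIX when the env is activated; check the exact include/lib names first):

# Check the actual names inside the env before exporting:
#   ls ${CONDA_PREFIX}/include | grep python
#   ls ${CONDA_PREFIX}/lib | grep libpython
export LD_LIBRARY_PATH=${CONDA_PREFIX}/lib:${LD_LIBRARY_PATH}
export PYTHON_FLAGS="-DPYTHON_EXECUTABLE:FILEPATH=${CONDA_PREFIX}/bin/python3.6
        -DPYTHON_INCLUDE_DIR:PATH=${CONDA_PREFIX}/include/python3.6m
        -DPYTHON_LIBRARIES:FILEPATH=${CONDA_PREFIX}/lib/libpython3.6m.so"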

zachary-zheng commented 1 year ago

Jetson NX, default Python 3.6.9, default cmake version 3.10.2. Building the Jetson deployment library: the C++ SDK build fails (screenshot attached), and the Python build also fails (screenshot attached).

monkeycc commented 1 year ago

@zachary-zheng The default cmake version 3.10.2 cannot build it; you need to upgrade. That is exactly what caused my error.

@jiangjiajun The official tutorial should be updated to require upgrading to cmake-3.25.2.

@monkeycc Try upgrading CMake and build again.

Download this file https://github.com/Kitware/CMake/releases/download/v3.25.2/cmake-3.25.2-linux-aarch64.tar.gz, extract it, then run the following commands:

export PATH=/Path/To/cmake-3.25.2/bin:${PATH}
cmake --version

After confirming the CMake version is 3.25, rebuild FastDeploy.

Note that the export above only changes the CMake version for the current terminal session. To upgrade CMake system-wide, add the export to your bashrc.

jiangjiajun commented 1 year ago

@zachary-zheng Try upgrading CMake the way @monkeycc described; if that solves it, I will update the build documentation.

zachary-zheng commented 1 year ago

Using CMake 3.25 did fix the Python build error 👍. I have not had time to test the C++ build yet; I will test tomorrow and update the results.

monkeycc commented 1 year ago

Using CMake 3.25 did fix the Python build error, and GPU detection also works fine. No problems. @jiangjiajun

zachary-zheng commented 1 year ago

@zachary-zheng Try upgrading CMake the way @monkeycc described; if that solves it, I will update the build documentation.

@monkeycc Try upgrading CMake and build again.

Download this file https://github.com/Kitware/CMake/releases/download/v3.25.2/cmake-3.25.2-linux-aarch64.tar.gz, extract it, then run the following commands:

export PATH=/Path/To/cmake-3.25.2/bin:${PATH}
cmake --version

After confirming the CMake version is 3.25, rebuild FastDeploy.

Note that the export above only changes the CMake version for the current terminal session. To upgrade CMake system-wide, add the export to your bashrc.

@jiangjiajun Confirmed: after updating to CMake 3.25, both the C++ and the Python builds succeed on the Jetson NX. Please update the build documentation, thanks jiajun. This issue can be closed.

827346462 commented 1 year ago


I installed it with pip3 install fastdeploy_python-0.0.0-cp36-cp36m-linux_aarch64.whl

Running the yolov5 example then fails with an error that vision cannot be found. Installing directly with pip fails, apparently because the default opencv==4.7.0.68 does not support Python 3.6, so I manually installed opencv==4.6.0.66.
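
A possible workaround sketch based on the report above (the wheel filename is the locally built one; opencv-python==4.6.0.66 is the version reported here to still work with Python 3.6):

# Pin the opencv-python release reported to work with Python 3.6, then install the local wheel
pip3 install opencv-python==4.6.0.66
pip3 install fastdeploy_python-0.0.0-cp36-cp36m-linux_aarch64.whl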

827346462 commented 1 year ago

The C++ build succeeds, and running the examples works fine as well.

827346462 commented 1 year ago


I installed it with pip3 install fastdeploy_python-0.0.0-cp36-cp36m-linux_aarch64.whl

Running the yolov5 example then fails with an error that vision cannot be found. Installing directly with pip fails, apparently because the default opencv==4.7.0.68 does not support Python 3.6, so I manually installed opencv==4.6.0.66.

Rebuilding solved the problem. The documentation is fine as-is, and opencv==4.6.0.66 also works.

kewuyu commented 1 year ago

Error message below; my CMake is already 3.25.2. Build command:

cmake .. -DBUILD_ON_JETSON=ON \
>          -DENABLE_VISION=ON \
>          -DENABLE_PADDLE_BACKEND=ON \
>          -DPADDLEINFERENCE_DIRECTORY=/home/kewuyu/Downloads/paddle_inference_install_dir \
>          -DCMAKE_INSTALL_PREFIX=${PWD}/installed_fastdeploy

CMake Error at cmake/paddle_inference.cmake:271 (string):
  string sub-command REGEX, mode MATCH needs at least 5 arguments total to
  command.
Call Stack (most recent call first):
  CMakeLists.txt:245 (include)

CMake Error at cmake/paddle_inference.cmake:272 (string):
  string sub-command REGEX, mode MATCH needs at least 5 arguments total to
  command.
Call Stack (most recent call first):
  CMakeLists.txt:245 (include)

CMake Error at cmake/paddle_inference.cmake:273 (string):
  string sub-command REGEX, mode MATCH needs at least 5 arguments total to
  command.
Call Stack (most recent call first):
  CMakeLists.txt:245 (include)

-- The CUDA compiler identification is NVIDIA 10.2.300
-- Detecting CUDA compiler ABI info
-- Detecting CUDA compiler ABI info - done
-- Check for working CUDA compiler: /usr/local/cuda-10.2/bin/nvcc - skipped
-- Detecting CUDA compile features
-- Detecting CUDA compile features - done
-- CUDA compiler: /usr/local/cuda-10.2/bin/nvcc, version: NVIDIA 10.2.300
-- CUDA detected: 10.2.300
-- NVCC_FLAGS_EXTRA:  -gencode arch=compute_53,code=sm_53 -gencode arch=compute_62,code=sm_62 -gencode arch=compute_72,code=sm_72
-- Use the opencv lib specified by user. The OpenCV path: /usr/lib/aarch64-linux-gnu/cmake/opencv4/
-- Found OpenCV: /usr (found version "4.1.1") 
CMake Warning (dev) at /home/kewuyu/cmake-3.25.2-linux-aarch64/share/cmake-3.25/Modules/ExternalProject.cmake:3075 (message):
  The DOWNLOAD_EXTRACT_TIMESTAMP option was not given and policy CMP0135 is
  not set.  The policy's OLD behavior will be used.  When using a URL
  download, the timestamps of extracted files should preferably be that of
  the time of extraction, otherwise code that depends on the extracted
  contents might not be rebuilt if the URL changes.  The OLD behavior
  preserves the timestamps from the archive instead, but this is usually not
  what you want.  Update your project to the NEW behavior or specify the
  DOWNLOAD_EXTRACT_TIMESTAMP option with a value of true to avoid this
  robustness issue.
Call Stack (most recent call first):
  /home/kewuyu/cmake-3.25.2-linux-aarch64/share/cmake-3.25/Modules/ExternalProject.cmake:4185 (_ep_add_download_command)
  cmake/paddle2onnx.cmake:70 (ExternalProject_Add)
  CMakeLists.txt:478 (include)
This warning is for project developers.  Use -Wno-dev to suppress it.

-- 
-- *************FastDeploy Building Summary**********
--   CMake version             : 3.25.2
--   CMake command             : /home/kewuyu/cmake-3.25.2-linux-aarch64/bin/cmake
--   System                    : Linux
--   C++ compiler              : /usr/bin/c++
--   C++ standard              : 11
--   C++ cuda standard         : 11
--   C++ compiler version      : 7.5.0
--   CXX flags                 : -Wno-format -g0 -O3
--   EXE linker flags          : 
--   Shared linker flags       : 
--   Build type                : 
--   Compile definitions       : _GLIBCXX_USE_CXX11_ABI=1;FASTDEPLOY_LIB;CMAKE_BUILD_TYPE=Release;ENABLE_ORT_BACKEND;ENABLE_PADDLE_BACKEND;WITH_GPU;ENABLE_TRT_BACKEND;ENABLE_VISION;ENABLE_PADDLE2ONNX
--   CMAKE_PREFIX_PATH         : 
--   CMAKE_INSTALL_PREFIX      : /home/kewuyu/FastDeploy/build/installed_fastdeploy
--   CMAKE_MODULE_PATH         : 
-- 
--   FastDeploy version        : 0.0.0
--   ENABLE_ORT_BACKEND        : ON
--   ENABLE_RKNPU2_BACKEND     : OFF
--   ENABLE_HORIZON_BACKEND    : OFF
--   ENABLE_SOPHGO_BACKEND     : OFF
--   ENABLE_PADDLE_BACKEND     : ON
--   ENABLE_LITE_BACKEND       : OFF
--   ENABLE_POROS_BACKEND      : OFF
--   ENABLE_TRT_BACKEND        : ON
--   ENABLE_OPENVINO_BACKEND   : OFF
--   ENABLE_TVM_BACKEND        : OFF
--   ENABLE_BENCHMARK          : OFF
--   ENABLE_VISION             : ON
--   ENABLE_TEXT               : OFF
--   ENABLE_ENCRYPTION         : OFF
--   ENABLE_FLYCV              : OFF
--   ENABLE_CVCUDA             : OFF
--   WITH_GPU                  : ON
--   WITH_IPU                  : OFF
--   WITH_OPENCL               : OFF
--   WITH_TESTING              : OFF
--   WITH_ASCEND               : OFF
--   WITH_DIRECTML             : OFF
--   WITH_TIMVX                : OFF
--   WITH_KUNLUNXIN            : OFF
--   WITH_CAPI                 : OFF
--   WITH_CSHARPAPI            : OFF
--   ONNXRuntime version       : 1.12.0
--   Paddle Inference version  : 
--   PADDLE_WITH_ENCRYPT       : OFF
--   PADDLE_WITH_AUTH          : OFF
--   CUDA_DIRECTORY            : /usr/local/cuda
--   TRT_DRECTORY              : 
-- Configuring incomplete, errors occurred!
See also "/home/kewuyu/FastDeploy/build/CMakeFiles/CMakeOutput.log".
kewuyu@kewuyu-desktop:~/FastDeploy/build$ cd ..
kewuyu@kewuyu-desktop:~/FastDeploy$ cmake --version
cmake version 3.25.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
kewuyu@kewuyu-desktop:~/FastDeploy$
Petal99 commented 1 year ago

(Quoting kewuyu's full comment above: the same cmake command, the paddle_inference.cmake:271-273 "string sub-command REGEX" errors, and the FastDeploy Building Summary showing CMake 3.25.2.)

I ran into the same problem. Have you solved it?