marcoslucianops / DeepStream-Yolo

NVIDIA DeepStream SDK 7.0 / 6.4 / 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 / 5.1 implementation for YOLO models
MIT License

How to install onnxsim on jetson xavier nx? #416

Open jhunmk29 opened 11 months ago

jhunmk29 commented 11 months ago

Hi, I read the 'YOLOv5 usage' tutorial. I ran the command pip3 install onnxsim, but it fails:

nvidia@nvidia-desktop:~$ pip3 install onnxsim
Defaulting to user installation because normal site-packages is not writeable
Collecting onnxsim
  Using cached onnxsim-0.4.13.tar.gz (18.1 MB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: onnx in ./.local/lib/python3.6/site-packages (from onnxsim) (1.4.1)
Requirement already satisfied: rich in ./.local/lib/python3.6/site-packages (from onnxsim) (12.6.0)
Requirement already satisfied: typing>=3.6.4 in ./.local/lib/python3.6/site-packages (from onnx->onnxsim) (3.7.4.3)
Requirement already satisfied: numpy in ./.local/lib/python3.6/site-packages (from onnx->onnxsim) (1.19.4)
Requirement already satisfied: six in /usr/lib/python3/dist-packages (from onnx->onnxsim) (1.11.0)
Requirement already satisfied: typing-extensions>=3.6.2.1 in ./.local/lib/python3.6/site-packages (from onnx->onnxsim) (4.1.1)
Requirement already satisfied: protobuf in ./.local/lib/python3.6/site-packages (from onnx->onnxsim) (3.19.6)
Requirement already satisfied: commonmark<0.10.0,>=0.9.0 in ./.local/lib/python3.6/site-packages (from rich->onnxsim) (0.9.1)
Requirement already satisfied: pygments<3.0.0,>=2.6.0 in ./.local/lib/python3.6/site-packages (from rich->onnxsim) (2.14.0)
Requirement already satisfied: dataclasses<0.9,>=0.7 in ./.local/lib/python3.6/site-packages (from rich->onnxsim) (0.8)
Building wheels for collected packages: onnxsim
  Building wheel for onnxsim (setup.py) ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-vb5n9rzq
       cwd: /tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/
  Complete output (86 lines):
  fatal: not a git repository (or any of the parent directories): .git
  fatal: not a git repository (or any of the parent directories): .git
  /home/nvidia/.local/lib/python3.6/site-packages/pkg_resources/__init__.py:119: PkgResourcesDeprecationWarning: 0.18ubuntu0.18.04.1 is an invalid version and will not be supported in a future release
    PkgResourcesDeprecationWarning,
  /home/nvidia/.local/lib/python3.6/site-packages/setuptools/installer.py:30: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
    SetuptoolsDeprecationWarning,
  running bdist_wheel
  running build
  running build_py
  running create_version
  creating build
  creating build/lib.linux-aarch64-3.6
  creating build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/version.py -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/model_info.py -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/__main__.py -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/model_checking.py -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/__init__.py -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/onnx_simplifier.py -> build/lib.linux-aarch64-3.6/onnxsim
  running egg_info
  writing onnxsim.egg-info/PKG-INFO
  writing dependency_links to onnxsim.egg-info/dependency_links.txt
  writing entry points to onnxsim.egg-info/entry_points.txt
  writing requirements to onnxsim.egg-info/requires.txt
  writing top-level names to onnxsim.egg-info/top_level.txt
  reading manifest file 'onnxsim.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  warning: no files found matching '*.c' under directory 'onnxsim'
  warning: no files found matching '*.proto' under directory 'onnxsim'
  warning: no previously-included files matching '*' found under directory 'third_party/onnxruntime'
  warning: no previously-included files matching '*' found under directory 'third_party/onnx-optimizer/build'
  warning: no previously-included files matching '*' found under directory 'third_party/onnx/build'
  warning: no previously-included files matching '*' found under directory 'third_party/onnx/onnx/backend'
  adding license file 'LICENSE'
  writing manifest file 'onnxsim.egg-info/SOURCES.txt'
  copying onnxsim/cpp2py_export.cc -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/cxxopts.hpp -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/onnxsim.cpp -> build/lib.linux-aarch64-3.6/onnxsim
  copying onnxsim/onnxsim.h -> build/lib.linux-aarch64-3.6/onnxsim
  creating build/lib.linux-aarch64-3.6/onnxsim/bin
  copying onnxsim/bin/onnxsim_bin.cpp -> build/lib.linux-aarch64-3.6/onnxsim/bin
  copying onnxsim/bin/onnxsim_option.cpp -> build/lib.linux-aarch64-3.6/onnxsim/bin
  copying onnxsim/bin/onnxsim_option.h -> build/lib.linux-aarch64-3.6/onnxsim/bin
  running build_ext
  running cmake_build
  Run command ['/usr/bin/cmake', '-DPython_INCLUDE_DIR=/usr/include/python3.6m', '-DPython_EXECUTABLE=/usr/bin/python3', '-DPYTHON_EXECUTABLE=/usr/bin/python3', '-DBUILD_ONNX_PYTHON=OFF', '-DONNXSIM_PYTHON=ON', '-DONNXSIM_BUILTIN_ORT=OFF', '-DONNX_USE_LITE_PROTO=OFF', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DONNX_OPT_USE_SYSTEM_PROTOBUF=OFF', '-DCMAKE_BUILD_TYPE=Release', '-DONNX_ML=1', '/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4']
  CMake Error at CMakeLists.txt:1 (cmake_minimum_required):
    CMake 3.22 or higher is required.  You are running version 3.10.2

  -- Configuring incomplete, errors occurred!
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/setup.py", line 305, in <module>
      'onnxsim=onnxsim:main',
    File "/home/nvidia/.local/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
      return distutils.core.setup(**attrs)
    File "/usr/lib/python3.6/distutils/core.py", line 148, in setup
      dist.run_commands()
    File "/usr/lib/python3.6/distutils/dist.py", line 955, in run_commands
      self.run_command(cmd)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/usr/lib/python3/dist-packages/wheel/bdist_wheel.py", line 204, in run
      self.run_command('build')
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/usr/lib/python3.6/distutils/command/build.py", line 135, in run
      self.run_command(cmd_name)
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/setup.py", line 213, in run
      self.run_command('cmake_build')
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4/setup.py", line 187, in run
      subprocess.check_call(cmake_args)
    File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
      raise CalledProcessError(retcode, cmd)
  subprocess.CalledProcessError: Command '['/usr/bin/cmake', '-DPython_INCLUDE_DIR=/usr/include/python3.6m', '-DPython_EXECUTABLE=/usr/bin/python3', '-DPYTHON_EXECUTABLE=/usr/bin/python3', '-DBUILD_ONNX_PYTHON=OFF', '-DONNXSIM_PYTHON=ON', '-DONNXSIM_BUILTIN_ORT=OFF', '-DONNX_USE_LITE_PROTO=OFF', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DONNX_OPT_USE_SYSTEM_PROTOBUF=OFF', '-DCMAKE_BUILD_TYPE=Release', '-DONNX_ML=1', '/tmp/pip-install-kcoba0j4/onnxsim_947557263a5e46a98aa3f5fd3cd003a4']' returned non-zero exit status 1.
  ----------------------------------------
  ERROR: Failed building wheel for onnxsim
  Running setup.py clean for onnxsim
Failed to build onnxsim
Installing collected packages: onnxsim
    Running setup.py install for onnxsim ... canceled
ERROR: Operation cancelled by user

How can I solve this problem?

marcoslucianops commented 11 months ago

You can skip onnxsim. It's only used to simplify the model (via the --simplify flag) during the export process.

jhunmk29 commented 11 months ago

@marcoslucianops

Thanks, but now I have another problem. I exported the ONNX model using YOLOv5 version 7, and when I run the command 'deepstream-app -c deepstream_app_config.txt', the app fails.

nvidia@nvidia-desktop:~/yolov5-tensorrt/DeepStream-Yolo$ deepstream-app -c deepstream_app_config.txt

Using winsys: x11 
ERROR: Deserialize engine failed because file path: /home/nvidia/yolov5-tensorrt/DeepStream-Yolo/model_b1_gpu0_fp32.engine open error
0:00:01.921716540 16460     0x39e64440 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/home/nvidia/yolov5-tensorrt/DeepStream-Yolo/model_b1_gpu0_fp32.engine failed
0:00:01.921903387 16460     0x39e64440 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/home/nvidia/yolov5-tensorrt/DeepStream-Yolo/model_b1_gpu0_fp32.engine failed, try rebuild
0:00:01.921946779 16460     0x39e64440 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
onnx2trt_utils.cpp:364: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
ModelImporter.cpp:720: While parsing node number 141 [Resize -> "onnx::Concat_271"]:
ModelImporter.cpp:721: --- Begin node ---
ModelImporter.cpp:722: input: "onnx::Resize_266"
input: "onnx::Resize_270"
input: "onnx::Resize_445"
output: "onnx::Concat_271"
name: "Resize_141"
op_type: "Resize"
attribute {
  name: "coordinate_transformation_mode"
  s: "asymmetric"
  type: STRING
}
attribute {
  name: "cubic_coeff_a"
  f: -0.75
  type: FLOAT
}
attribute {
  name: "mode"
  s: "nearest"
  type: STRING
}
attribute {
  name: "nearest_mode"
  s: "floor"
  type: STRING
}

ModelImporter.cpp:723: --- End node ---
ModelImporter.cpp:726: ERROR: builtin_op_importers.cpp:3422 In function importResize:
[8] Assertion failed: scales.is_weights() && "Resize scales must be an initializer!"

Could not parse the ONNX model

Failed to build CUDA engine
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:02.140975759 16460     0x39e64440 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1934> [UID = 1]: build engine file failed
0:00:02.141073903 16460     0x39e64440 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2020> [UID = 1]: build backend context failed
0:00:02.141121454 16460     0x39e64440 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1257> [UID = 1]: generate backend failed, check config file settings
0:00:02.141196942 16460     0x39e64440 WARN                 nvinfer gstnvinfer.cpp:841:gst_nvinfer_start:<primary_gie> error: Failed to create NvDsInferContext instance
0:00:02.141227918 16460     0x39e64440 WARN                 nvinfer gstnvinfer.cpp:841:gst_nvinfer_start:<primary_gie> error: Config file path: /home/nvidia/yolov5-tensorrt/DeepStream-Yolo/config_infer_primary_yoloV5.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: <main:707>: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(841): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie:
Config file path: /home/nvidia/yolov5-tensorrt/DeepStream-Yolo/config_infer_primary_yoloV5.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
App run failed

How can I solve this problem?

marcoslucianops commented 11 months ago

Add --opset 12 to the export command and try again.

jhunmk29 commented 11 months ago

@marcoslucianops It didn't help; I get the same error. My YOLOv5 version is 7.

marcoslucianops commented 11 months ago

Are you using a custom model?

jhunmk29 commented 11 months ago

Are you using a custom model?

No, I converted the stock yolov5s.pt to yolov5s.onnx.

jhunmk29 commented 11 months ago

@marcoslucianops I updated to JetPack 5.0.1 DP, and the engine builds now. But when I run DeepStream, it doesn't show the detection results. Why? This is my configuration file:

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
onnx-file=yolov5s.onnx
model-engine-file=model_b1_gpu0_fp32.engine
#int8-calib-file=calib.table
labelfile-path=labels.txt
batch-size=1
network-mode=0
num-detected-classes=80
interval=0
gie-unique-id=1
process-mode=1
network-type=0
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
#force-implicit-batch-dim=1
#workspace-size=1000
parse-bbox-func-name=NvDsInferParseYolo
#parse-bbox-func-name=NvDsInferParseYoloCuda
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
nms-iou-threshold=0.45
pre-cluster-threshold=0.25
topk=300

And in the DeepStream folder I have the label file "labels.txt". The yolov5s model is not custom.
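As a side note on the config above: net-scale-factor is just 1/255, which maps 8-bit pixel values (0-255) into the [0, 1] range the model expects; the long decimal is the single-precision rounding of that fraction. A minimal sanity check, assuming nothing beyond the value itself:

```python
# net-scale-factor from the config above: DeepStream multiplies every input
# pixel by this value, scaling 8-bit pixels (0-255) into [0, 1].
net_scale_factor = 0.0039215697906911373

# The value is 1/255 rounded to single precision, so it matches the exact
# fraction to float32 accuracy (the difference is on the order of 1e-9).
assert abs(net_scale_factor - 1.0 / 255.0) < 1e-8
print("net-scale-factor is 1/255 (to float32 precision)")
```

A wrong net-scale-factor is a common reason for missing detections, so confirming it matches the model's expected preprocessing is a cheap first check.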

PigletPh commented 11 months ago

@jhunmk29 Hi, have you solved the problem? When I run DeepStream, it also doesn't show the detection results.

jhunmk29 commented 11 months ago

@PigletPh No, but if I solve it, I will let you know.

marcoslucianops commented 11 months ago

What is your PyTorch version?

jhunmk29 commented 11 months ago

@marcoslucianops PyTorch 2.0, torchvision 0.16.

jhunmk29 commented 11 months ago

@marcoslucianops I have used YOLOv5's detect.py to detect targets in the sample video (python3 detect.py --weights yolov5s.pt --source sample_1080p_h265.mp4 --device 0), and it detects them correctly. I think the PyTorch environment is OK.

marcoslucianops commented 11 months ago

Can you try with PyTorch < 2.0?

marcoslucianops commented 11 months ago

It could be a problem in the new PyTorch when converting the layers to ONNX. That's why I asked you to try a previous version.

shuzhenglin commented 8 months ago

@jhunmk29 I think I have the same problem as you. Have you solved it? I also can't see the bounding boxes on screen.

shuzhenglin commented 8 months ago

@jhunmk29 PyTorch 1.10.0 and torchvision 0.11.1 on my Jetson; I think those versions are relatively old.

puwenjie commented 7 months ago

I had the same problem as you and I have solved it ("Failed to build onnxsim" / "No module named 'onnxsim'"). Method: directly pip3 install onnxsim, but pay attention to your CMake version. The old version (1.18.3) does not work; I upgraded CMake to 3.27.5 and the build succeeded, via pip3 install cmake==3.27.5.
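This matches the CMake error in the first log ("CMake 3.22 or higher is required. You are running version 3.10.2"): the system CMake shipped with the older JetPack image is too old for onnxsim's build, while a pip-installed CMake satisfies the minimum. A small sketch of the version comparison involved; meets_requirement is an illustrative helper (not part of CMake or pip), and the version strings are the ones reported in this thread:

```python
def version_tuple(version: str) -> tuple:
    """Parse a dotted version string like '3.10.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def meets_requirement(installed: str, required: str) -> bool:
    """True if the installed version satisfies the minimum requirement."""
    return version_tuple(installed) >= version_tuple(required)

REQUIRED = "3.22"  # minimum stated by onnxsim's CMakeLists.txt in the error log

print(meets_requirement("3.10.2", REQUIRED))  # system CMake from the log -> False
print(meets_requirement("3.27.5", REQUIRED))  # pip-installed CMake -> True
```

Tuple comparison works here because Python compares tuples element by element, which is exactly the ordering a dotted version number implies.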

marcoslucianops commented 7 months ago

You can skip the onnxsim installation. It's only used to simplify the model via the --simplify arg in the export script.