ultralytics / yolov5

YOLOv5 πŸš€ in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

ONNX: export failure ❌ 0.0s: No module named 'onnx' #11819

Closed NevilleMthw closed 1 year ago

NevilleMthw commented 1 year ago

Search before asking

YOLOv5 Component

Export

Bug

export: data=data/coco128.yaml, weights=['best_yolov5s_handwave1.pt'], imgsz=[640, 640], batch_size=1, device=0, half=False, inplace=False, keras=False, optimize=False, int8=False, dynamic=False, simplify=False, opset=12, verbose=False, workspace=4, nms=False, agnostic_nms=False, topk_per_class=100, topk_all=100, iou_thres=0.45, conf_thres=0.25, include=['onnx']
YOLOv5 πŸš€ v7.0-0-g915bbf2 Python-3.6.9 torch-1.9.0 CUDA:0 (NVIDIA Tegra X2, 7850MiB)

Fusing layers... 
YOLOv5s summary: 157 layers, 7012822 parameters, 0 gradients

PyTorch: starting from best_yolov5s_handwave1.pt with output shape (1, 25200, 6) (13.7 MB)
WARNING ⚠️ Python 3.7.0 is required by YOLOv5, but Python 3.6.9 is currently installed
ONNX: export failure ❌ 0.0s: No module named 'onnx'

Environment

OS: Ubuntu 18.04 Bionic Beaver
Python: 3.6.9
Platform: NVIDIA Jetson TX2
JetPack: 4.6.4
CUDA: 10.2
TensorRT: 8.2.1.32

Minimal Reproducible Example

python3 export.py --weights best_yolov5s_handwave1.pt --include onnx --device 0

Additional

I have installed ONNX Runtime through Jetson Zoo (https://elinux.org/Jetson_Zoo#ONNX_Runtime) for the TX2 on Python 3.6. Even though YOLOv5 supports Python 3.7 onwards, version 3.6.9 works fine and there are no issues when I run detect.py for inference. I have also used export.py to convert to TFLite and it works well with 3.6.9. I have not faced any issues with the Python version on my edge device.
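For reference, onnxruntime (installed from Jetson Zoo) and onnx are separate Python packages, and export.py imports onnx itself. A quick check that illustrates this (a minimal sketch):

python3 -c "import onnxruntime; print(onnxruntime.__version__)"  # works, installed from Jetson Zoo
python3 -c "import onnx; print(onnx.__version__)"                # fails here: No module named 'onnx'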

Are you willing to submit a PR?

github-actions[bot] commented 1 year ago

πŸ‘‹ Hello @NevilleMthw, thank you for your interest in YOLOv5 πŸš€! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a πŸ› Bug Report, please provide a minimum reproducible example to help us debug it.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset image examples and training logs, and verify you are following our Tips for Best Training Results.

Requirements

Python>=3.7.0 with all requirements.txt installed including PyTorch>=1.7. To get started:

git clone https://github.com/ultralytics/yolov5  # clone
cd yolov5
pip install -r requirements.txt  # install

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status

YOLOv5 CI

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training, validation, inference, export and benchmarks on macOS, Windows, and Ubuntu every 24 hours and on every commit.

Introducing YOLOv8 πŸš€

We're excited to announce the launch of our latest state-of-the-art (SOTA) object detection model for 2023 - YOLOv8 πŸš€!

Designed to be fast, accurate, and easy to use, YOLOv8 is an ideal choice for a wide range of object detection, image segmentation and image classification tasks. With YOLOv8, you'll be able to quickly and accurately detect objects in real-time, streamline your workflows, and achieve new levels of accuracy in your projects.

Check out our YOLOv8 Docs for details and get started with:

pip install ultralytics

glenn-jocher commented 1 year ago

@NevilleMthw the issue you're experiencing is related to the ONNX module not being found during the export process. This can happen if the required dependencies for ONNX are not installed.

To resolve this issue, please ensure that you have the ONNX module installed correctly. You can install it by running pip install onnx.

Additionally, please make sure that all the other required dependencies are installed as well. You can find the complete list of dependencies in the requirements.txt file in the YOLOv5 repository.
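As a rough sketch (assuming pip targets the same Python 3 interpreter that runs export.py):

python3 -m pip install -r requirements.txt  # core dependencies
python3 -m pip install onnx                 # needed for --include onnx
python3 export.py --weights best_yolov5s_handwave1.pt --include onnx --device 0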

If the issue persists after installing the ONNX module, please let us know and provide any additional information or error messages that you may have encountered. We'll be happy to assist you further!

Thank you for your willingness to contribute by submitting a PR! We appreciate your support and collaboration to make YOLOv5 even better.

NevilleMthw commented 1 year ago

Hi @glenn-jocher, thanks for the reply. I am facing an issue when trying to install ONNX through pip.

WARNING: pip is being invoked by an old script wrapper. This will fail in a future version of pip.
Please see https://github.com/pypa/pip/issues/5599 for advice on fixing the underlying issue.
To avoid this problem you can invoke Python with '-m pip' instead of running pip directly.
Defaulting to user installation because normal site-packages is not writeable
Collecting onnx==1.9.0
  Using cached onnx-1.9.0.tar.gz (9.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: protobuf in /usr/lib/python3/dist-packages (from onnx==1.9.0) (3.0.0)
Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/nvidia/.local/lib/python3.6/site-packages (from onnx==1.9.0) (4.1.1)
Requirement already satisfied: six in /usr/lib/python3/dist-packages (from onnx==1.9.0) (1.11.0)
Requirement already satisfied: numpy>=1.16.6 in /home/nvidia/.local/lib/python3.6/site-packages (from onnx==1.9.0) (1.19.4)
Building wheels for collected packages: onnx
  Building wheel for onnx (pyproject.toml) ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 /home/nvidia/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmpqnn_zec7
       cwd: /tmp/pip-install-zjh3j4ry/onnx_91197dd371094a188738ebfd4c2dc4c4
  Complete output (82 lines):
  fatal: not a git repository (or any of the parent directories): .git
  running bdist_wheel
  running build
  running build_py
  running create_version
  running cmake_build
  Using cmake args: ['/usr/bin/cmake', '-DPYTHON_INCLUDE_DIR=/usr/include/python3.6m', '-DPYTHON_EXECUTABLE=/usr/bin/python3', '-DBUILD_ONNX_PYTHON=ON', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DCMAKE_BUILD_TYPE=Release', '-DONNX_ML=1', '/tmp/pip-install-zjh3j4ry/onnx_91197dd371094a188738ebfd4c2dc4c4']
  -- The C compiler identification is GNU 7.5.0
  -- The CXX compiler identification is GNU 7.5.0
  -- Check for working C compiler: /usr/bin/cc
  -- Check for working C compiler: /usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /usr/bin/c++
  -- Check for working CXX compiler: /usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found PythonInterp: /usr/bin/python3 (found version "3.6.9")
  -- Found PythonLibs: /usr/lib/aarch64-linux-gnu/libpython3.6m.so (found version "3.6.9")
  Generated: /tmp/pip-install-zjh3j4ry/onnx_91197dd371094a188738ebfd4c2dc4c4/.setuptools-cmake-build/onnx/onnx-ml.proto
  CMake Error at CMakeLists.txt:292 (message):
    Protobuf compiler not found
  Call Stack (most recent call first):
    CMakeLists.txt:323 (relative_protobuf_generate_cpp)

  -- Configuring incomplete, errors occurred!
  See also "/tmp/pip-install-zjh3j4ry/onnx_91197dd371094a188738ebfd4c2dc4c4/.setuptools-cmake-build/CMakeFiles/CMakeOutput.log".
  /tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/dist.py:726: UserWarning: Usage of dash-separated 'license-file' will not be supported in future versions. Please use the underscore name 'license_file' instead
    % (opt, underscore_opt)
  Traceback (most recent call last):
    File "/home/nvidia/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
      main()
    File "/home/nvidia/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/home/nvidia/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 262, in build_wheel
      metadata_directory)
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 231, in build_wheel
      wheel_directory, config_settings)
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 215, in _build_with_temp_dir
      self.run_setup()
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 268, in run_setup
      self).run_setup(setup_script=setup_script)
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 158, in run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 359, in <module>
      'backend-test-tools = onnx.backend.test.cmd_tools:main',
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
      return distutils.core.setup(**attrs)
    File "/usr/lib/python3.6/distutils/core.py", line 148, in setup
      dist.run_commands()
    File "/usr/lib/python3.6/distutils/dist.py", line 955, in run_commands
      self.run_command(cmd)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/tmp/pip-build-env-2e3jebon/overlay/lib/python3.6/site-packages/wheel/bdist_wheel.py", line 299, in run
      self.run_command('build')
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/usr/lib/python3.6/distutils/command/build.py", line 135, in run
      self.run_command(cmd_name)
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "setup.py", line 233, in run
      self.run_command('cmake_build')
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "setup.py", line 219, in run
      subprocess.check_call(cmake_args)
    File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
      raise CalledProcessError(retcode, cmd)
  subprocess.CalledProcessError: Command '['/usr/bin/cmake', '-DPYTHON_INCLUDE_DIR=/usr/include/python3.6m', '-DPYTHON_EXECUTABLE=/usr/bin/python3', '-DBUILD_ONNX_PYTHON=ON', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DCMAKE_BUILD_TYPE=Release', '-DONNX_ML=1', '/tmp/pip-install-zjh3j4ry/onnx_91197dd371094a188738ebfd4c2dc4c4']' returned non-zero exit status 1.
  ----------------------------------------
  ERROR: Failed building wheel for onnx
Failed to build onnx
ERROR: Could not build wheels for onnx, which is required to install pyproject.toml-based projects

glenn-jocher commented 1 year ago

The error you are facing occurs because the Protobuf compiler is not found during the build process. The relevant lines from your error output are:

CMake Error at CMakeLists.txt:292 (message):
  Protobuf compiler not found

ONNX requires the Protobuf compiler to build from source. Here are the steps to resolve this issue:

  1. First, make sure that you have the Protobuf compiler installed. You can install it via your system's package manager (a short verification sketch follows these steps). On Ubuntu/Debian-based systems, you can do this by running:

    sudo apt-get install protobuf-compiler libprotobuf-dev

    On other systems, the command might be different.

  2. Also, you might want to update pip to the latest version if it is not already up to date:

    python -m pip install --upgrade pip

    or if you're using Python 3:

    python3 -m pip install --upgrade pip
  3. Once you have the Protobuf compiler installed, try installing ONNX again:

    pip install onnx==1.9.0

    or

    python -m pip install onnx==1.9.0
  4. If you still face issues, it might be related to the version of Python you are using or other system dependencies. Make sure that you are using a supported version of Python for the ONNX version you are trying to install.

Please note that sometimes building libraries with native extensions can be tricky, and you might need to ensure that your system has all the required dependencies installed.
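A quick end-to-end verification (a minimal sketch; package names assume an Ubuntu 18.04 / apt-based system like your Jetson):

sudo apt-get install -y protobuf-compiler libprotobuf-dev
protoc --version                    # should print a libprotoc version
python3 -m pip install --upgrade pip
python3 -m pip install onnx==1.9.0  # retry the source build with protoc available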

NevilleMthw commented 1 year ago

Hi @glenn-jocher, thanks for the feedback. This definitely solved the issue. Thanks a lot. πŸ‘πŸ»

Also, this is not related but would it be possible to check this discussion? https://github.com/ultralytics/yolov5/discussions/11816

glenn-jocher commented 11 months ago

@NevilleMthw you're welcome, and I'm glad to hear that the issue has been resolved!

Regarding the discussion you've mentioned, I'll definitely take a look at it and see if I can provide any assistance. Thank you for bringing it to my attention.

If you have any further questions or need assistance with anything else, feel free to ask. We're here to help!

BedoHatem commented 2 months ago

I got this error message while following the Ultralytics tutorial on converting a YOLO model to ONNX: https://docs.ultralytics.com/integrations/onnx/#usage

Fix: I changed from Python 3.10 to 3.9.
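For reference, the export flow from that tutorial is roughly as follows (a sketch of the documented yolo CLI; yolov8n.pt is just an example model):

pip install ultralytics
yolo export model=yolov8n.pt format=onnx  # writes yolov8n.onnx alongside the weights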

glenn-jocher commented 2 months ago

Thanks for sharing your solution. If you encounter any more issues, ensure you're using the latest version of the repository and dependencies. If you need further assistance, feel free to ask.
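For example (a minimal sketch for a git-cloned YOLOv5 checkout):

cd yolov5
git pull                             # update the repository
pip install -U -r requirements.txt   # update dependencies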