ultralytics / yolov5

YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

Yet another export yolov5 models to ONNX and inference with TensorRT #1597

Closed linghu8812 closed 3 years ago

linghu8812 commented 3 years ago

Hello everyone, here is a repo that can convert the yolov5 model to an ONNX model and run inference with TensorRT. The code is here: https://github.com/linghu8812/tensorrt_inference/tree/master/project/yolov5. It supports all yolov5 models, including yolov5s, yolov5m, yolov5l and yolov5x. The onnxsim module is used to simplify the yolov5 structure. Before simplification, the yolov5 ONNX structure looked like this:

[screenshot: ONNX graph before simplification]

After simplification, the ONNX model becomes:

[screenshot: ONNX graph after simplification]

Some extra nodes have been removed by the simplification.

In addition, TensorRT inference code has also been supplied; the inference result is shown below:

[screenshot: TensorRT inference result]

github-actions[bot] commented 3 years ago

Hello @linghu8812, thank you for your interest in 🚀 YOLOv5! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a 🐛 Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we can not help you.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available.

For business inquiries or professional support requests please visit https://www.ultralytics.com or email Glenn Jocher at glenn.jocher@ultralytics.com.

Requirements

Python 3.8 or later with all requirements.txt dependencies installed, including torch>=1.7. To install run:

$ pip install -r requirements.txt

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status

CI CPU testing

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), testing (test.py), inference (detect.py) and export (export.py) on macOS, Windows, and Ubuntu every 24 hours and on every commit.

glenn-jocher commented 3 years ago

@linghu8812 very nice! Did you have to configure onnxsim especially to achieve those simplifications or did it do them on its own?

linghu8812 commented 3 years ago

@glenn-jocher hello,

First of all, onnx-simplifier needs to be installed with pip install onnx-simplifier; then the simplification code is:

    # ONNX export (excerpt from models/export.py; model, img and opt are defined earlier in the script)
    try:
        import onnx
        from onnxsim import simplify

        print('\nStarting ONNX export with onnx %s...' % onnx.__version__)
        f = opt.weights.replace('.pt', '.onnx')  # output filename
        torch.onnx.export(model, img, f, verbose=False, opset_version=12, input_names=['images'],
                          output_names=['output'])

        # Checks
        onnx_model = onnx.load(f)  # load onnx model
        model_simp, check = simplify(onnx_model)  # constant-fold and remove redundant nodes
        assert check, "Simplified ONNX model could not be validated"
        onnx.save(model_simp, f)  # overwrite with the simplified model
        # print(onnx.helper.printable_graph(onnx_model.graph))  # print a human readable model
        print('ONNX export success, saved as %s' % f)
    except Exception as e:
        print('ONNX export failure: %s' % e)

al03 commented 3 years ago

> (quoting @linghu8812's ONNX export snippet from the comment above)

I tried this export script, but I don't get the simplified layers you showed.

My output layer info is this:

[screenshot: exported output layers]

glenn-jocher commented 3 years ago

@al03 did the model simplify at all? I'm interested to see if it works, I'll try myself.

glenn-jocher commented 3 years ago

@al03 @linghu8812 I get an error on import onnxsim, I'm not able to evaluate it. I used pip install onnx-simplifier with Python 3.8.0: https://pypi.org/project/onnx-simplifier/

[screenshot: import onnxsim error]

Will raise an issue on the onnxsim repo. EDIT: issue raised https://github.com/daquexian/onnx-simplifier/issues/109

linghu8812 commented 3 years ago

https://github.com/linghu8812/yolov5/blob/bc2874fe025430e6f710b7e26054646f88c4e86e/models/yolo.py#L43-L63

@al03 the yolo.py file has been changed a little; it makes it more convenient for C++ code to decode boxes from the output tensors.

austingg commented 3 years ago

@glenn-jocher you may need to install the onnxruntime library first.

glenn-jocher commented 3 years ago

@austingg yes it seems so according to https://github.com/daquexian/onnx-simplifier/issues/109#issuecomment-749966150. I was interested in using onnxsim as part of the default code, but I looked at the install instructions (https://www.onnxruntime.ai/docs/get-started/install.html) and the prerequisites and OS-specific instructions appear too burdensome to include as part of the default repo. I think it would cause users more confusion/problems than it would solve.

If this is a common use case though (and it seems it may be) it might make sense to place these instructions within a Tutorial that we could add to https://docs.ultralytics.com/yolov5. That way expert users could still benefit.

daquexian commented 3 years ago

> (quoting @glenn-jocher's comment above)

@glenn-jocher I have updated onnx-simplifier to v0.2.26 so that it depends on onnxruntime-noopenmp instead of onnxruntime, according to https://github.com/microsoft/onnxruntime/issues/6511. I believe the instructions in https://www.onnxruntime.ai/docs/get-started/install.html are not actually needed if we don't depend on OpenMP, and onnx-simplifier will work like a charm without any additional setup. Could you please give onnx-simplifier a try? :D

glenn-jocher commented 3 years ago

@daquexian oh really? I actually gave up on the process before after seeing the complicated dependency requirements. So what exactly are the pip installs required now to use onnx-simplifier? It would be nice to integrate it into export.py if we can get simple dependencies and the installs all pass the CI checks on the 3 main OS's.

glenn-jocher commented 3 years ago

@daquexian the current CI checks do an ONNX export BTW here: https://github.com/ultralytics/yolov5/runs/1803124986?check_suite_focus=true

[screenshot: CI log of the ONNX export step]

The export tests are defined here: https://github.com/ultralytics/yolov5/blob/be9edffded6b690168e8b92dd33cf471d09f8f13/.github/workflows/ci-testing.yml#L79

Failures in an export won't fail the CI, as they are in try/except clauses, but they provide nice real-time insight into whether the exports are working or not, since the tests run every day.
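That try/except pattern can be sketched as follows; `run_exports` and the toy exporter functions here are hypothetical stand-ins for illustration, not the real export.py API:

```python
# Sketch of the pattern described above: each export is attempted
# independently, so one failure is recorded without aborting the rest.
def run_exports(exporters):
    results = {}
    for name, fn in exporters.items():
        try:
            fn()
            results[name] = 'success'
        except Exception as e:
            results[name] = f'failure: {e}'
    return results

def export_onnx():
    pass  # stands in for an export that succeeds

def export_coreml():
    raise RuntimeError('coremltools not installed')  # stands in for one that fails

results = run_exports({'onnx': export_onnx, 'coreml': export_coreml})
print(results)  # onnx: success, coreml: failure, and the run still completes
```

This is why a red log line in the CI output does not turn the badge red: the failure is captured and printed rather than propagated.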

daquexian commented 3 years ago

So what exactly are the pip installs required now to use onnx-simplifier?

Just update https://github.com/ultralytics/yolov5/blob/master/.github/workflows/ci-testing.yml#L51 with pip install -q onnx onnx-simplifier>=0.2.26 and update models/export.py like https://github.com/ultralytics/yolov5/issues/1597#issuecomment-738475425. No extra instructions are needed :D

glenn-jocher commented 3 years ago

@daquexian I was able to install in Colab, but not locally on macos for some reason. Python 3.9 appears incompatible, so I tried with a 3.8.0 environment, but got this. Do you know what the issue might be?

[screenshot: pip install error on macOS with Python 3.8]

daquexian commented 3 years ago

@glenn-jocher Oh, it's my fault... onnxruntime-noopenmp doesn't have a macOS version. That's so strange. I'll update this issue when onnxruntime has better macOS support.

al03 commented 3 years ago

> @al03 did the model simplify at all? I'm interested to see if it works, I'll try myself.

It did work after converting to the other format, but the detection layer (grid, sigmoid, NMS) needs to be implemented separately.
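For reference, the grid-plus-sigmoid decode that has to be reimplemented can be sketched in numpy; the decode equations below follow the yolov5 v4.0 style, and the shapes, anchor values and helper names are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_scale(raw, grid, anchors, stride):
    """Decode one detection scale (sketch, before NMS).
    raw:     (na, ny, nx, no) raw network outputs, no = 5 + num_classes
    grid:    (ny, nx, 2) cell column/row indices
    anchors: (na, 2) anchor sizes in pixels
    """
    y = sigmoid(raw)
    xy = (y[..., 0:2] * 2.0 - 0.5 + grid) * stride          # box centers in pixels
    wh = (y[..., 2:4] * 2.0) ** 2 * anchors[:, None, None]  # box sizes in pixels
    return np.concatenate([xy, wh, y[..., 4:]], axis=-1)

# Tiny demo: one anchor on a 2x2 grid at stride 8, all-zero logits (sigmoid -> 0.5)
ny = nx = 2
grid = np.stack(np.meshgrid(np.arange(nx), np.arange(ny)), axis=-1).astype(float)
raw = np.zeros((1, ny, nx, 6))
out = decode_scale(raw, grid, np.array([[10.0, 13.0]]), stride=8)
```

With zero logits the cell (0, 0) prediction decodes to center (4, 4) and size equal to the anchor, which is a handy sanity check when porting the decode to C++.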

linghu8812 commented 3 years ago

linghu8812/tensorrt_inference#42

YOLOv5 4.0 models such as yolov5s6.pt are now supported for inference with TensorRT!

for yolov5s6, just run with

./yolov5_trt ../config6.yaml ../samples

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

PiyalGeorge commented 3 years ago

@linghu8812, I converted the yolov5s model to an ONNX model following this. Then I simplified the ONNX model using this. I tried this with yolov5s release 3.1.

I wish to know the output nodes before the 5D Reshape. How can I get these output nodes? Kindly help.

glenn-jocher commented 3 years ago

@PiyalGeorge I'm not sure exactly what you mean by the output nodes before 5D reshape, though the 5D reshape is in the Detect layer, so what you are looking for is probably there.

https://github.com/ultralytics/yolov5/blob/5f7d39fede4de8af98472bd009c63c3a86568e2d/models/yolo.py#L24-L58
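The 5D reshape that those lines perform can be illustrated with a minimal numpy sketch (numpy stands in for torch here; the shapes assume 80 classes, 3 anchors per scale and a single 20x20 grid):

```python
import numpy as np

# The Detect layer's 5D reshape, with numpy in place of torch.
# bs = batch, na = anchors per scale, no = 5 + num_classes, ny/nx = grid size.
bs, na, no, ny, nx = 1, 3, 85, 20, 20
conv_out = np.zeros((bs, na * no, ny, nx))  # raw conv output: (1, 255, 20, 20)

# view(bs, na, no, ny, nx).permute(0, 1, 3, 4, 2) in torch terms:
x5 = conv_out.reshape(bs, na, no, ny, nx).transpose(0, 1, 3, 4, 2)
# x5 has shape (1, 3, 20, 20, 85): one 85-value prediction per anchor per cell
```

So the "output nodes before the 5D Reshape" are the raw (bs, 255, ny, nx) convolution outputs of the three detection heads.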

On a side note, onnx-simplifier is now integrated with YOLOv5 export via PR https://github.com/ultralytics/yolov5/pull/2815, you can access it like this in the latest code:

python export.py --simplify

bertinma commented 3 years ago

@glenn-jocher the Detect layer still cannot be exported to ONNX?

glenn-jocher commented 3 years ago

@bertinma yes, the Detect() layer exports to ONNX with export.py.

ithmz commented 3 years ago

Hi, may I ask how you get the last output layer (1, 25200, 85)? Thanks

glenn-jocher commented 3 years ago

@tsangz189 YOLOv5 output grids are flattened and concatenated to form a single output.

python models/export.py --grid --simplify
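The 25200 in that shape follows from the three detection scales; the arithmetic below assumes a 640x640 input and 80 COCO classes:

```python
# Where (1, 25200, 85) comes from at 640x640 input: three detection scales
# (strides 8, 16, 32) with three anchors each, flattened and concatenated.
img_size = 640
strides = [8, 16, 32]
anchors_per_scale = 3
rows = sum(anchors_per_scale * (img_size // s) ** 2 for s in strides)
cols = 5 + 80  # x, y, w, h, objectness + 80 class scores
print(rows, cols)  # 25200 85
```

That is, 3 * (80*80 + 40*40 + 20*20) = 25200 candidate boxes, each with 85 values.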
[screenshot: export output with --grid --simplify]

ithmz commented 3 years ago

@glenn-jocher Hi, thanks for your reply, but when I run the script python models/export.py --simplify, the outputs are not concatenated to form a single output.

glenn-jocher commented 3 years ago

@tsangz189 --grid forms the single output:

python models/export.py --grid --simplify

ttanzhiqiang commented 3 years ago

You can try https://github.com/ttanzhiqiang/onnx_tensorrt_project.git