MPolaris / onnx2tflite

Tool for onnx->keras or onnx->tflite. Hope this tool can help you.
Apache License 2.0

YOLOv5n ONNX to TFLite export not working with `KeyError: 'onnx::Resize_448'` #8

Closed. glenn-jocher closed this issue 2 years ago.

glenn-jocher commented 2 years ago

Hi! I'm getting an error in Google Colab when attempting to convert an ONNX model to TFLite using this tool. I've attached the ONNX model causing the bug, and also provided full code to reproduce below. I tried YOLOv5n models exported to ONNX with opsets 11, 12 and 13, and saw the problem in all of them.

Code to Reproduce

!git clone https://github.com/ultralytics/yolov5
!git clone https://github.com/MPolaris/onnx2tflite
%pip install -qr yolov5/requirements.txt  # install

!python yolov5/export.py --weights ./yolov5n.pt --include onnx

%cd onnx2tflite
from converter import onnx_converter
onnx_converter(
    onnx_model_path = "../yolov5n.onnx",
    need_simplify = True,
    output_path = "./",
    target_formats = ['tflite'], # or ['keras'], ['keras', 'tflite']
    weight_quant = False,
    int8_model = False,
    int8_mean = None,
    int8_std = None,
    image_root = None
)

Result

WARNING:onnx_loader running::模型优化失败, 从../yolov5n.onnx加载  (model optimization failed; loading from ../yolov5n.onnx)
/content/onnx2tflite/onnx2tflite
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-6-5f2991b979cc> in <module>
     16     int8_mean = None,
     17     int8_std = None,
---> 18     image_root = None
     19 )

2 frames
/content/onnx2tflite/layers/common_layers.py in __init__(self, tensor_grap, node_weights, node_inputs, node_attribute, *args, **kwargs)
    199         else:
    200             # 从scales取
--> 201             _, _, nh, nw = node_weights[node_inputs[2]]
    202             _, h, w, _ = tensor_grap[node_inputs[0]].shape
    203             nh, nw = int(h*nh), int(w*nw)

KeyError: 'onnx::Resize_445'
[Screenshot: 2022-08-31 at 00 55 11]

@ayushExel
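
For context, the KeyError means the Resize node's scales input (node_inputs[2]) is not a key in node_weights, i.e. it is a tensor produced at runtime rather than a constant initializer, which is exactly the kind of subgraph onnx-simplifier would normally fold into a constant. A minimal diagnostic sketch, assuming only the stock onnx Python API and the unsimplified ../yolov5n.onnx, to list which Resize inputs are initializers:

import onnx

# Load the unsimplified model and collect the names of all constant initializers.
model = onnx.load("../yolov5n.onnx")
initializer_names = {init.name for init in model.graph.initializer}

# For every Resize node, report whether each input is a stored initializer
# or a tensor computed at runtime (the latter triggers the KeyError above).
for node in model.graph.node:
    if node.op_type == "Resize":
        for idx, name in enumerate(node.input):
            if not name:
                continue  # skip empty optional inputs (e.g. an omitted roi)
            kind = "initializer" if name in initializer_names else "runtime tensor"
            print(f"{node.name} input[{idx}] = {name!r}: {kind}")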

glenn-jocher commented 2 years ago

Sorry, forgot to attach my ONNX model earlier. I've attached it here:

yolov5n.onnx.zip

The model opens correctly in Netron; it is the official YOLOv5 v6.2 nano model exported to ONNX, not a custom-trained model.

[Screenshot: 2022-08-31 at 00 58 47]
MPolaris commented 2 years ago

> Sorry, forgot to attach my ONNX model earlier. I've attached it here:
>
> yolov5n.onnx.zip
>
> The model opens correctly in Netron; it is the official YOLOv5 v6.2 nano model exported to ONNX, not a custom-trained model.
>
> [Screenshot: 2022-08-31 at 00 58 47]

Thanks for your issue. I have tested the ONNX model and the conversion succeeded. The key is that you should set need_simplify = True. Your error may be caused by onnx-simplifier; the simplification step is very important for the conversion. My code is the latest, and the requirements' versions are as follows:

python 3.8
tensorflow 2.8.0
onnx 1.11.0
onnx-simplifier 0.3.8 or 0.4.8
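
For reference, a minimal sketch of running the simplification step yourself before conversion, assuming the onnxsim Python API shipped with onnx-simplifier 0.3.x/0.4.x (onnx2tflite performs the equivalent step internally when need_simplify = True):

import onnx
from onnxsim import simplify

# Load the exported model, simplify it (this folds the dynamic Resize scales
# into constants), and save the result next to the original.
model = onnx.load("../yolov5n.onnx")
model_simplified, check = simplify(model)
assert check, "simplified ONNX model could not be validated"
onnx.save(model_simplified, "../yolov5n-sim.onnx")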
glenn-jocher commented 2 years ago

@MPolaris awesome, thanks for the help! Your solution works. Code to reproduce a correct export:

!git clone https://github.com/ultralytics/yolov5
!git clone https://github.com/MPolaris/onnx2tflite
%pip install -qr yolov5/requirements.txt  # install

!python yolov5/export.py --weights ./yolov5n.pt --include onnx --simplify

%cd onnx2tflite
from converter import onnx_converter
onnx_converter(
    onnx_model_path = "../yolov5n.onnx",
    need_simplify = True,
    output_path = "./",
    target_formats = ['tflite'], # or ['keras'], ['keras', 'tflite']
    weight_quant = False,
    int8_model = False,
    int8_mean = None,
    int8_std = None,
    image_root = None
)
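
As a quick sanity check on the converted model, here is a minimal sketch using the standard TensorFlow Lite interpreter and a dummy input; the output filename is an assumption, since the converter writes the .tflite file into output_path based on the ONNX model's name:

import numpy as np
import tensorflow as tf

# Assumed output filename; adjust to whatever the converter actually wrote.
interpreter = tf.lite.Interpreter(model_path="./yolov5n.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run one inference on an all-zeros tensor matching the model's input shape.
dummy = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

print("output shape:", interpreter.get_tensor(output_details[0]["index"]).shape)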
MPolaris commented 2 years ago

@glenn-jocher I'm glad to help. Please star this repository, and thanks again for your issue.