chad-green opened 4 years ago
I'm having the same problem. It seems that the Google Drive file that was linked is different from the one used for the conversion?
I have the same question!
@ravimashru Please help us!
@hoangkhoiLE I worked off the model weights already available here: https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/yolov4/model
You could maybe take a look at what changed in the repository https://github.com/hunglc007/tensorflow-yolov4-tflite.
@ravimashru Thank you for your response!
At first, when I used the model from https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/yolov4/model, it gave me an error:

```
RuntimeError: /onnxruntime_src/onnxruntime/core/session/inference_session.cc:215 onnxruntime::InferenceSession::InferenceSession(const onnxruntime::SessionOptions&, const string&, onnxruntime::logging::LoggingManager*) status.IsOK() was false. Given model could not be parsed while creating inference session. Error message: Protobuf parsing failed.
```

when I just tried to load the model:

```python
sess = rt.InferenceSession("yolov4.onnx")
outputs = sess.get_outputs()
output_names = list(map(lambda output: output.name, outputs))
input_name = sess.get_inputs()[0].name
```
I have tried several different versions of onnx, but the error still exists.
So I continued to try with the Conversion Jupyter notebook: https://github.com/onnx/models/blob/master/vision/object_detection_segmentation/yolov4/dependencies/Conversion.ipynb
I have verified the link to the yolov4 weights; it is still the same link with the same id. So the change in https://github.com/hunglc007/tensorflow-yolov4-tflite may not be the cause.
Just out of curiosity, can you show us how you load the model from https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/yolov4/model, and which Python environment and dependencies work well for you?
Thank you,
Best regards
I use it exactly like you: `sess = rt.InferenceSession("yolov4.onnx")`
The versions that are currently working for me:
numpy==1.19.4
onnx==1.7.0
onnxruntime==1.5.2
protobuf==3.13.0
six==1.15.0
typing-extensions==3.7.4.3
Also, I see that someone had a similar error in https://github.com/onnx/models/issues/306, which turned out to be due to a corrupted yolov4.onnx file.
You could check the MD5 checksum of the file you have and see if it matches mine (2e0eeb4de8da2a0663ae3eb4a0dabbce) to ensure we have the same weights file. :)
@ravimashru Thank you for your kind response. I redownloaded the file and it works now. Thank you a lot and have a good day.
I have the same issue. Has anyone solved it?
This issue was solved by the following operations:

```
!git clone https://github.com/hunglc007/tensorflow-yolov4-tflite.git
%pushd tensorflow-yolov4-tflite/
!git reset --hard 3f8ed00b73a577e9a3cba8bc559f6064b565dbaa
```
Maybe the implementation of the TF model's outputs was changed on Jul 2-6, 2020: https://github.com/hunglc007/tensorflow-yolov4-tflite/commits/master/save_model.py
YOLOv4 -> .onnx conversion commands:

```
!python save_model.py --weights yolov4.weights --output yolov4.tf --input_size 416 --model yolov4
!python -m tf2onnx.convert --saved-model yolov4.tf --output yolov4_1_416_416_3_converted_by_myself.onnx --opset 11 --verbose
```
Log without `!git reset`:

```
2021-02-16 05:52:32,083 - INFO - tf2onnx: inputs: ['input_1:0']
2021-02-16 05:52:32,084 - INFO - tf2onnx: outputs: ['Identity:0']
```

Log with `!git reset`:

```
2021-02-16 05:26:46,820 - INFO - tf2onnx: inputs: ['input_1:0']
2021-02-16 05:26:46,821 - INFO - tf2onnx: outputs: ['Identity:0', 'Identity_1:0', 'Identity_2:0']
```
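The difference between the two logs can also be checked at runtime after conversion. A small sketch of my own (the helper name is hypothetical; the commented usage assumes onnxruntime is installed and the converted file exists locally):

```python
def has_three_yolo_heads(output_names):
    """The correctly converted YOLOv4 graph exposes three outputs,
    one per detection scale; the broken conversion has only one."""
    return len(output_names) == 3

# Usage with onnxruntime (not run here):
# import onnxruntime as rt
# sess = rt.InferenceSession("yolov4_1_416_416_3_converted_by_myself.onnx")
# names = [o.name for o in sess.get_outputs()]
# assert has_three_yolo_heads(names), "got outputs {}".format(names)
```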
My package versions:
easydict 1.9
numpy 1.19.0
onnx 1.7.0
onnxruntime 1.6.0
protobuf 3.12.2
tensorflow 2.2.0
tf2onnx 1.9.0
Hi,
I could help myself with:
```python
def reshape_output(trt_output):
    # Map the flat TensorRT output back to (batch, grid, grid, anchors, 5 + classes)
    if len(trt_output) % (52 * 52) == 0:
        return trt_output.reshape(1, 52, 52, 3, 85)
    elif len(trt_output) % (26 * 26) == 0:
        return trt_output.reshape(1, 26, 26, 3, 85)
    elif len(trt_output) % (13 * 13) == 0:
        return trt_output.reshape(1, 13, 13, 3, 85)
    else:
        print('unknown trt_output size {}'.format(len(trt_output)))
        return []
```
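The hard-coded grid sizes above could also be inferred from the flat length, since each YOLO head holds grid * grid * 3 anchors * 85 channels per image. A sketch of that idea, assuming NumPy and the standard COCO 85-channel layout (the function name is my own):

```python
import math

import numpy as np

def reshape_yolo_head(flat, anchors=3, channels=85):
    """Infer the grid size from the flat output length and reshape
    to (1, grid, grid, anchors, channels)."""
    grid = math.isqrt(len(flat) // (anchors * channels))
    if grid * grid * anchors * channels != len(flat):
        raise ValueError("unexpected output size {}".format(len(flat)))
    return np.asarray(flat).reshape(1, grid, grid, anchors, channels)
```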
Still, the resulting coordinates are far off from where they should be. It might be a rounding issue (TensorRT) or model misuse; I can't tell yet.
I hope the above helps. Keep me posted if you find a way to make it work.
Best, Jan
Bug Report
Which model does this pertain to?
yolov4
Describe the bug
Attempted to replicate results using the steps described in Conversion.ipynb. The result is a much different graph that gives errors when running with inference.ipynb: it only generates one output, of shape (1, 33, 84),
whereas your ONNX model has output shapes (1, 52, 52, 3, 85), (1, 26, 26, 3, 85), and (1, 13, 13, 3, 85). How did you change the output shape?
Reproduction instructions
System Information
OS Platform and Distribution: Linux Ubuntu 18.04
ONNX version: 1.7.0
Backend/Runtime version: ONNX Runtime 1.5.2
Provide a code snippet to reproduce your errors: run the suggested tutorials as written:
https://github.com/onnx/models/blob/master/vision/object_detection_segmentation/yolov4/dependencies/Conversion.ipynb
https://github.com/onnx/models/blob/master/vision/object_detection_segmentation/yolov4/dependencies/inference.ipynb