jslok closed this issue 8 months ago
The model is a custom-trained RTMDet-Ins from mmdetection (PyTorch) that I converted to ONNX.
Remove this garbage-like post-processing.
If there is no activity within the next two days, this issue will be closed automatically.
@PINTO0309 Can you explain more, please? I only enabled NMS, but if I remove it the output will be unusable. Maybe I misunderstand the situation.
Edit: Found the section about NMS in your docs. Is this what you mean? https://github.com/PINTO0309/onnx2tf?tab=readme-ov-file#10-fixing-the-output-of-nonmaxsuppression-nms
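If removing the post-processing means cutting the graph just before the NonMaxSuppression block, I assume something like the sketch below with onnx.utils.extract_model would do it. The output names here are placeholders, not the real tensor names in my graph; those would have to be read off the graph (e.g. by opening it in Netron).

```python
import onnx

# Placeholder tensor names below -- the actual pre-NMS tensor names must be
# taken from the graph itself (e.g. by inspecting end2end_320.onnx in Netron).
onnx.utils.extract_model(
    input_path="end2end_320.onnx",
    output_path="end2end_320_no_postprocess.onnx",
    input_names=["input"],
    output_names=["pre_nms_boxes", "pre_nms_scores", "pre_nms_masks"],
)
```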
Were you able to get it to convert? I am facing the same issue.
Issue Type
Others
OS
Windows
onnx2tf version number
1.19.11
onnx version number
1.15.0
onnxruntime version number
1.16.3
onnxsim (onnx_simplifier) version number
0.4.33
tensorflow version number
2.15.0
Download URL for ONNX
end2end.onnx.zip
Parameter Replacement JSON
Description
I am exploring how to convert and quantize my model to TFLite uint8.
onnx2tf -i end2end_320.onnx -oiqt -qt per-tensor -onwdt -ois input:1,3,320,320 -prf parameter_replacement.json
I tried some different values in the parameter replacement JSON, but the error always remains the same. I am clearly not understanding something. Any help would be much appreciated!
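For what it's worth, the file I pass with -prf follows the structure described in the onnx2tf README. The sketch below only illustrates that structure; the op name, tensor name, and values are placeholders rather than the ones from end2end_320.onnx.

```python
import json

# Illustrative parameter_replacement.json in the onnx2tf format.
# "op_name", "param_name", and "values" are placeholders and must be
# replaced with real node/tensor names and values from the ONNX graph.
replacement = {
    "format_version": 1,
    "operations": [
        {
            "op_name": "some_onnx_op_name",    # ONNX op to patch
            "param_target": "inputs",          # "inputs", "outputs", or "attributes"
            "param_name": "some_tensor_name",  # tensor/attribute to override
            "values": [1, 320, 320, 3],        # replacement value
        }
    ],
}

with open("parameter_replacement.json", "w") as f:
    json.dump(replacement, f, indent=2)
```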