First of all, thank you very much for your work! While trying to replicate it and convert the ONNX model to TensorRT, I encountered a warning: "onnx2trt_utils.cpp:365: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32." I ignored this warning, but the final visualization showed multiple spurious bounding boxes. On reflection, I believe I may need to convert your ONNX model's weights to INT32 first. However, during that conversion I hit an error: "onnx.onnx_cpp2py_export.checker.ValidationError: Nodes in a graph must be topologically sorted, however input '/head/Mod_output_0' of node: name: '/head/Div' OpType: Div is not the output of any previous nodes." Do you have any suggestions?
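For context on what the checker is complaining about: "topologically sorted" means every node's input tensors must be produced by nodes that appear *earlier* in the graph's node list (or be graph inputs/initializers). Here is a minimal, dependency-free sketch of that ordering rule using Kahn's algorithm, with hypothetical node dicts standing in for ONNX `NodeProto`s (the names mirror the ones from the error message):

```python
from collections import deque

def toposort(nodes):
    """Reorder nodes so every node's inputs are produced by earlier nodes.

    Each node is a dict with "name", "inputs", and "outputs" lists of tensor
    names, loosely mirroring how ONNX graph nodes are wired together.
    """
    # Map each tensor name to the index of the node that produces it.
    producer = {out: i for i, node in enumerate(nodes) for out in node["outputs"]}

    indegree = []
    dependents = {i: [] for i in range(len(nodes))}
    for i, node in enumerate(nodes):
        # Inputs with no producer are graph inputs/initializers; ignore them.
        deps = [producer[t] for t in node["inputs"] if t in producer]
        indegree.append(len(deps))
        for d in deps:
            dependents[d].append(i)

    queue = deque(i for i, d in enumerate(indegree) if d == 0)
    order = []
    while queue:
        i = queue.popleft()
        order.append(i)
        for j in dependents[i]:
            indegree[j] -= 1
            if indegree[j] == 0:
                queue.append(j)

    if len(order) != len(nodes):
        raise ValueError("graph has a cycle; cannot topologically sort")
    return [nodes[i] for i in order]

# The failing pattern from the error: Div appears before the Mod that feeds it.
nodes = [
    {"name": "/head/Div", "inputs": ["/head/Mod_output_0"], "outputs": ["div_out"]},
    {"name": "/head/Mod", "inputs": ["x"], "outputs": ["/head/Mod_output_0"]},
]
print([n["name"] for n in toposort(nodes)])
```

If this matches the actual problem, re-sorting the graph before running the checker may be enough; I believe onnx-graphsurgeon exposes this as `graph.toposort()` on an imported graph, though I have not verified that it resolves this specific model.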