Closed · gcunhase closed this issue 3 months ago
@DerryHub @firestonelib
This is due to the plugin inputs/outputs not having their types defined. After fixing this manually (with ORT + onnx_graphsurgeon), the model can run with trtexec.
Is there a way for the ONNX model to be exported with tensor types already?
Error
Observed with TensorRT 10.0.1.6 on an RTX 3090:
Steps to reproduce