Closed phamkhactu closed 1 year ago
Your code is wrong, you can refer to https://github.com/NVIDIA/TensorRT/blob/1d6bf36034bb07fc88d924ebf5d34d358298e545/samples/python/efficientdet/build_engine.py#L168 or any other TRT samples in the repo.
@zerollzeng Thank you, but when I change the code as you suggest:
```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_detec_engine(onnx_path, using_half=True, dynamic_input=True, workspace_size=2,
                       min_shape=(1, 3, 256, 256), opt_shape=(1, 3, 640, 960),
                       max_shape=(1, 3, 1280, 1280)):
    trt.init_libnvinfer_plugins(None, '')
    # 1 == 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network(1) as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        builder.max_batch_size = 1  # always 1 for explicit batch
        config = builder.create_builder_config()
        config.max_workspace_size = GiB(int(workspace_size))
        if using_half:
            config.set_flag(trt.BuilderFlag.FP16)
        # Load the ONNX model and parse it in order to populate the TensorRT network.
        with open(onnx_path, 'rb') as model:
            if not parser.parse(model.read()):
                print('ERROR: Failed to parse the ONNX file.')
                for error in range(parser.num_errors):
                    print(parser.get_error(error))
                return None
        if dynamic_input:
            profile = builder.create_optimization_profile()
            profile.set_shape("x", min_shape, opt_shape, max_shape)
            config.add_optimization_profile(profile)
        return builder.build_serialized_network(network, config)
```
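The `GiB` helper used for `config.max_workspace_size` is not defined in the snippet; in the TensorRT Python samples it is a small unit-conversion utility. A minimal equivalent:

```python
def GiB(val):
    # Convert a size in gibibytes to bytes: 1 GiB = 2**30 bytes.
    return val * (1 << 30)
```

So `GiB(2)` yields 2147483648 bytes, i.e. a 2 GiB workspace limit for the builder.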
I get this error:

```
with trt.Builder(TRT_LOGGER) as builder, builder.create_network(1) as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
TypeError: pybind11::init(): factory function returned nullptr
```
My TensorRT version is 8.5.3.1. Can you give me some advice on how to solve it? Thank you.
I have found that this issue comes from a version conflict: some layers cannot be converted to TensorRT layers.
Thank you very much.
What version of TensorRT did you use to solve the problem?
Thanks for the great repo.
I want to convert an ONNX model with a dynamic input x = [1, 3, -1, -1] to TensorRT (my TensorRT version is 8.5.3.1), but when I run the conversion script I get an error:
I think this error comes from:
Here is my code:
Thank you very much for your help.
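For context on what the optimization profile in the build script constrains: an engine built with min/opt/max shapes only accepts runtime inputs whose dimensions all fall within [min, max]. The following is a hypothetical helper (not part of the TensorRT API) sketching that check:

```python
def shape_in_profile(shape, min_shape, max_shape):
    """Return True if every dimension of `shape` lies within the profile's
    [min, max] range. Illustration only, not a TensorRT API call."""
    if not (len(shape) == len(min_shape) == len(max_shape)):
        return False
    return all(lo <= d <= hi for d, lo, hi in zip(shape, min_shape, max_shape))

# The opt shape from the profile above is accepted:
print(shape_in_profile((1, 3, 640, 960), (1, 3, 256, 256), (1, 3, 1280, 1280)))   # → True
# A spatial size below the profile minimum is rejected:
print(shape_in_profile((1, 3, 128, 128), (1, 3, 256, 256), (1, 3, 1280, 1280)))   # → False
```

If inference fails at runtime with a shape error, checking the input shape against the profile bounds this way is a quick sanity test before rebuilding the engine.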