haiderasad opened this issue 3 months ago
You will need to set the batch size to -1 in your ONNX model to enable dynamic batching for your TensorRT engine.
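For anyone landing here later: one common way to get that -1 batch dimension is the `dynamic_axes` argument of `torch.onnx.export`. This is only a hedged sketch, not from the example repo; the input/output names and the (1, 360, 640) dummy shape are assumptions you would adapt to your own model.

```python
# Sketch: export an ONNX model whose batch dimension is dynamic (-1).
# torch is imported lazily so the snippet parses even without PyTorch installed.
def export_dynamic_batch(model, onnx_path, height=360, width=640):
    import torch
    dummy = torch.randn(1, height, width)  # build-time batch of 1
    torch.onnx.export(
        model,
        dummy,
        onnx_path,
        input_names=["input"],    # assumed names, adapt to your model
        output_names=["output"],
        # dim 0 becomes a symbolic "batch" axis, i.e. -1 in the ONNX graph
        dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    )
```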
like this?
Yes. But there are many other caveats to using dynamic batching. This example emphasizes the TensorRT custom plugin interface; dynamic batching is considered a somewhat more "advanced" feature and is not covered in this example.
If you are interested in using dynamic batching for your TensorRT engine and custom plugins, please refer to the TensorRT developer guide for guidance.
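For reference, the build-time half of what the developer guide describes boils down to registering an optimization profile with min/opt/max shapes for each dynamic input. A minimal sketch (the shape bounds and the assumption that input 0 is the image are illustrative, not from this repo):

```python
# Sketch: register min/opt/max batch sizes for a dynamic input at build time.
# builder/network/config are the usual trt.Builder / INetworkDefinition /
# IBuilderConfig objects, passed in so no tensorrt import is needed here.
def add_batch_profile(builder, network, config,
                      min_b=1, opt_b=4, max_b=10, h=360, w=640):
    profile = builder.create_optimization_profile()
    name = network.get_input(0).name  # assumes input 0 is the dynamic one
    profile.set_shape(name, (min_b, h, w), (opt_b, h, w), (max_b, h, w))
    config.add_optimization_profile(profile)
    return config
```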
Thanks for the info. I was able to get batching to work; it's just the `host_device_buffer` in `inputs` that needs to be sized dynamically based on the runtime input batch size.
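To make that concrete, here is roughly what sizing the host buffer from the runtime shape (instead of the build-time shape) looks like in plain NumPy. `host_buffer_for` is a hypothetical helper for illustration, not a function from the example code:

```python
import numpy as np

def host_buffer_for(shape, dtype=np.float32):
    # Size the flat host-side staging buffer from the actual runtime
    # shape rather than the shape the engine was built with.
    return np.empty(int(np.prod(shape)), dtype=dtype)

batch = np.zeros((6, 360, 640), dtype=np.float32)  # runtime batch of 6
buf = host_buffer_for(batch.shape)
np.copyto(buf, batch.ravel())  # sizes now match: 6 * 360 * 640 == 1382400
```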
Hey, nice work, but I think the dynamic batching flow is a bit broken. It works fine if the engine is built on one shape, but when it's built with dynamic shapes and I give it an image of (6, 360, 640) it says:

```
ValueError: could not broadcast input array from shape (1382400,) into shape (230400,)
```
Upon investigating, I see that the shape and size of the inputs are set to (10, 360, 640), so it's expecting (10, 360, 640); I don't know why. Are you aware of the best practice in TensorRT for handling dynamic inputs?
Below is my whole code:
engine building
Main.py
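On the best-practice question: with a dynamically shaped engine you have to tell the execution context the actual input shape before every inference, otherwise the bindings keep the size from the optimization profile (which would explain the (10, 360, 640) you're seeing). A hedged sketch of that step; the tensor name is an assumption, and note that newer TensorRT versions use `set_input_shape` while older ones use `set_binding_shape`:

```python
# Sketch: per-inference shape setup for a dynamic-batch engine.
# `context` is the object from engine.create_execution_context();
# `batch` is a NumPy array shaped (N, 360, 640), where N varies per call.
def prepare_dynamic_input(context, name, batch):
    context.set_input_shape(name, tuple(batch.shape))
    # Host/device buffers should then be (re)sized from batch.size,
    # not from the build-time binding shape.
    return batch.size
```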