Closed: snazau closed this issue 2 years ago.
I solved the issue by changing the source code for my specific model.
The problem for me was that, for some reason, the library takes only a single sample as input:
inputs = [tensor.clone()[0:1] for tensor in inputs]
and then, during inference, prepends an extra dimension equal to the batch size:
shape = (batch_size,) + tuple(self.engine.get_binding_shape(idx))
Here is the code I'm using now: trt_functions.txt. As a result, I got inference roughly 2x faster than PyTorch. A patch-style sketch of the change is shown below.
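For context, here is a patch-style sketch of the kind of change described above. Which two lines in torch2trt's source are involved is an assumption reconstructed from the snippets quoted here; the actual working code is in trt_functions.txt.

# Before: torch2trt keeps only the first sample when building the engine
# inputs = [tensor.clone()[0:1] for tensor in inputs]
# After: keep the full batch so the engine is built for the real batch size
inputs = [tensor.clone() for tensor in inputs]

# Before: an extra batch dimension is prepended at inference time
# shape = (batch_size,) + tuple(self.engine.get_binding_shape(idx))
# After: the binding shape already carries the batch dimension
shape = tuple(self.engine.get_binding_shape(idx))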
Goal
I want to speed up a segmentation model from the segmentation_models.pytorch library. I would like to use the model with a fixed batch size > 1.
Describe the bug
I use the torch2trt function with use_onnx=True. It converts the model successfully, but during inference the TRT model's predictions match the PyTorch model's only for the second sample in the batch (when batch_size > 1; with batch_size=1 it works fine).
System information
To Reproduce
I have a conda environment (torch2trt.txt). I noticed that something was wrong while testing with random inputs of different shapes (code: main.txt). My suspicion was also confirmed by the strange distributions of the predictions (I uploaded screenshots of the distributions to Google Drive).
Then I plugged the TRT model into the existing pipeline and looked at the predictions on real data with different batch sizes. It became obvious that only one example from the batch is predicted correctly by the TRT model (image predictions are in the "image_prediciton" folder on Google Drive); a per-sample check like the sketch below makes this easy to see.
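A minimal sketch of the kind of per-sample check that exposes the mismatch (not the contents of main.txt; the model and input shape are placeholders standing in for the real segmentation model):

import torch
from torch2trt import torch2trt

# Placeholder model and input; the real pipeline uses a
# segmentation_models.pytorch model and batch_size > 1.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, 3, padding=1),
    torch.nn.ReLU(),
).eval().cuda()
x = torch.randn(4, 3, 256, 256).cuda()

model_trt = torch2trt(model, [x], use_onnx=True)

with torch.no_grad():
    out_torch = model(x)
    out_trt = model_trt(x)

# Compare each sample separately: with the bug, only one sample in the
# batch agrees with PyTorch and the rest diverge.
for i in range(x.shape[0]):
    diff = (out_torch[i] - out_trt[i]).abs().max().item()
    print(f"sample {i}: max abs diff = {diff:.6f}")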
My attempts to fix
Since I'm using use_onnx=True, I replaced
torch.onnx.export(module, inputs, f, input_names=input_names, output_names=output_names)
with
torch.onnx.export(module, inputs, f, input_names=input_names, output_names=output_names, dynamic_axes={'input_0': {0: 'batch_size'}, 'output_0': {0: 'batch_size'}})
in the torch2trt function to enable an arbitrary batch size (a standalone sketch of this export is included at the end), but it caused an error:
Also, I tried use_onnx=False,
but then it failed to convert the model with the following error:
If you need any additional information, feel free to ask.
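For reference, a standalone sketch of the dynamic-axes export attempted above, outside of torch2trt (the model and input shape are placeholders; the input_0/output_0 names follow the ones used in the export call):

import torch

# Placeholder model and dummy input
model = torch.nn.Conv2d(3, 8, 3, padding=1).eval()
dummy = torch.randn(1, 3, 256, 256)

# Dimension 0 of the input and the output is marked dynamic so the
# exported ONNX graph accepts an arbitrary batch size.
torch.onnx.export(
    model,
    dummy,
    "model.onnx",
    input_names=["input_0"],
    output_names=["output_0"],
    dynamic_axes={
        "input_0": {0: "batch_size"},
        "output_0": {0: "batch_size"},
    },
)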