NVIDIA-AI-IOT / torch2trt

An easy to use PyTorch to TensorRT converter
MIT License
4.55k stars · 671 forks

[TensorRT] ERROR: Cannot deserialize plugin interpolate #37

Closed mitsuix closed 5 years ago

mitsuix commented 5 years ago

I installed torch2trt with the `--plugins` option and successfully converted my model to a TensorRT model. I then saved it with `torch.save(model_trt.state_dict(), 'test.pth')` and reloaded it with `model_trt = TRTModule()` followed by `model_trt.load_state_dict(torch.load('test.pth'))`, but I hit this problem:

[TensorRT] ERROR: getPluginCreator could not find plugin interpolate version 1 namespace torch2trt
[TensorRT] ERROR: Cannot deserialize plugin interpolate

Any suggestions? Thanks!
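For reference, the save/load round trip in question can be sketched as below. The model class and input shape are placeholders for my own setup, and torch2trt must have been installed with `--plugins` so the `interpolate` plugin is available; this follows the usage shown in the torch2trt README.

```python
import torch
from torch2trt import torch2trt, TRTModule

# Convert a PyTorch model (placeholder: any eval-mode CUDA model).
model = MyModel().cuda().eval()          # hypothetical model class
x = torch.ones((1, 3, 224, 224)).cuda()  # placeholder input shape
model_trt = torch2trt(model, [x])

# Serialize the module state, which embeds the TensorRT engine.
torch.save(model_trt.state_dict(), 'test.pth')

# Later: deserialize into a fresh TRTModule and run inference.
model_trt = TRTModule()
model_trt.load_state_dict(torch.load('test.pth'))
y_trt = model_trt(x)
```

Deserialization is the step that fails here: loading the engine requires TensorRT to look up the `interpolate` plugin creator in its plugin registry.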

mitsuix commented 5 years ago

I pulled the latest version, and there are no errors while loading the model, but I hit another problem:

File "/home/kejia/workspace_pycharm/github/torch2trt/torch2trt/torch2trt.py", line 228, in forward
    self.context.execute_async(batch_size, bindings, torch.cuda.current_stream().cuda_stream)
RuntimeError: Dimension out of range (expected to be in range of [-1, 0], but got 1) (maybe_wrap_dim at /pytorch/c10/core/WrapDimMinimal.h:20)

jaybdub commented 5 years ago

Hi mitsuix,

Thanks for reaching out.

Sorry, there was an issue with initialization of the interpolate plugin after deserializing from file.

I've fixed this in this commit:

https://github.com/NVIDIA-AI-IOT/torch2trt/commit/c3889fb37553a474672d6c736dd9b47048c521b4

You'll need to re-build the TensorRT engine, but you should be able to then save/load/execute properly.

Please let me know if this works for you.

Best, John

mitsuix commented 5 years ago

It works for me. Thank you for your help!
