NVIDIA-AI-IOT / torch2trt

An easy to use PyTorch to TensorRT converter

group norm related problem #793

Open thorstenwagner opened 2 years ago

thorstenwagner commented 2 years ago

Hi,

I'm trying to get torch2trt running, but I'm hitting group norm related problems:

Warning: Encountered known unsupported method torch.zeros
Warning: Encountered known unsupported method torch.Tensor.__hash__
[the torch.Tensor.__hash__ warning above repeats roughly 70 more times]
Warning: Encountered known unsupported method torch.Tensor.get_device
Warning: Encountered known unsupported method torch.Tensor.is_complex
[08/25/2022-12:46:24] [TRT] [E] 2: [symbolicDims.cpp::ndGetC::116] Error Code 2: Internal Error (Assertion d.nbDims >= nbSpatialDims + 1 failed. )
Traceback (most recent call last):
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/bin/tomotwin_embed.py", line 33, in <module>
    sys.exit(load_entry_point('tomotwin', 'console_scripts', 'tomotwin_embed.py')())
  File "/mnt/data/twagner/Projects/TomoTwin/src/tomotwin-github/tomotwin/embed_main.py", line 424, in _main_
    embedor = TorchEmbedor(
  File "/mnt/data/twagner/Projects/TomoTwin/src/tomotwin-github/tomotwin/modules/inference/embedor.py", line 463, in __init__
    self.model = torch2trt(self.model, [x])
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch2trt-0.4.0-py3.9.egg/torch2trt/torch2trt.py", line 736, in torch2trt
    outputs = module(*inputs)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/parallel/data_parallel.py", line 166, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/mnt/data/twagner/Projects/TomoTwin/src/tomotwin-github/tomotwin/modules/networks/SiameseNet3D.py", line 528, in forward
    out = self.conv_layer0(inputtensor)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/container.py", line 139, in forward
    input = module(input)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch/nn/modules/normalization.py", line 272, in forward
    return F.group_norm(
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch2trt-0.4.0-py3.9.egg/torch2trt/torch2trt.py", line 307, in wrapper
    converter["converter"](ctx)
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch2trt-0.4.0-py3.9.egg/torch2trt/converters/group_norm.py", line 42, in convert_group_norm
    a_var, eps_trt = broadcast_trt_tensors(ctx.network, [a_var, eps_trt], len(split_shape))
  File "/opt/user_software/miniconda3/envs/tomotwin_t12/lib/python3.9/site-packages/torch2trt-0.4.0-py3.9.egg/torch2trt/torch2trt.py", line 192, in broadcast_trt_tensors
    if len(t.shape) < broadcast_ndim:
ValueError: __len__() should return >= 0

Here is the code: https://github.com/MPI-Dortmund/tomotwin-cryoet/blob/tensorrt/tomotwin/modules/inference/embedor.py#L462

Is group_norm not supported somehow?
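For reference, here is a minimal sketch that I believe reproduces the problem. The layer sizes and the input shape are placeholders I made up for illustration, not the actual SiameseNet3D configuration:

```python
import torch
import torch.nn as nn
from torch2trt import torch2trt

# Hypothetical minimal module: one 3D convolution followed by GroupNorm,
# roughly mirroring the conv_layer0 block where the conversion fails.
model = nn.Sequential(
    nn.Conv3d(1, 64, kernel_size=3, padding=1),
    nn.GroupNorm(num_groups=64, num_channels=64),
).eval().cuda()

# 5D input (N, C, D, H, W); the spatial size is only a placeholder.
x = torch.randn(1, 1, 37, 37, 37).cuda()

# Fails inside convert_group_norm, as in the traceback above.
model_trt = torch2trt(model, [x])
```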

I'm using the current master of torch2trt.
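One idea I have not tried yet (so treat it purely as an assumption): replacing nn.GroupNorm with an equivalent module built only from primitives that torch2trt should be able to convert (view, mean, element-wise math), which would bypass convert_group_norm entirely. A rough, untested sketch:

```python
import torch
import torch.nn as nn

class ManualGroupNorm(nn.Module):
    """Drop-in replacement for nn.GroupNorm built from basic tensor ops.
    Untested with torch2trt; only an idea for sidestepping convert_group_norm."""

    def __init__(self, num_groups, num_channels, eps=1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(num_channels))
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x):
        orig_shape = x.shape
        n, c = orig_shape[0], orig_shape[1]
        # Normalize each group over its channels and all spatial positions.
        g = x.view(n, self.num_groups, -1)
        mean = g.mean(dim=-1, keepdim=True)
        var = ((g - mean) ** 2).mean(dim=-1, keepdim=True)
        g = (g - mean) / torch.sqrt(var + self.eps)
        x = g.view(orig_shape)
        # Apply the per-channel affine parameters.
        affine_shape = [1, c] + [1] * (len(orig_shape) - 2)
        return x * self.weight.view(affine_shape) + self.bias.view(affine_shape)
```

If torch2trt can convert all of those primitives, swapping the modules in before calling torch2trt might get past this error, but again, this is untested.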

Best, Thorsten

kishcs commented 1 year ago

I am also facing the same issue with a different model (https://github.com/clovaai/deep-text-recognition-benchmark) while converting to TRT.

braj29 commented 1 year ago

Did anyone find a solution? Please help me, I am facing the same issue.