DerryHub / BEVFormer_tensorrt

BEVFormer inference on TensorRT, including INT8 Quantization and Custom TensorRT Plugins (float/half/half2/int8).
Apache License 2.0

torch.onnx.export: transforming BEVFormer to ONNX raises "aten::mul not supported" error #26

Closed zxros10 closed 1 year ago

zxros10 commented 1 year ago

When I train a model from https://github.com/fundamentalvision/BEVFormer and transform it to ONNX, the following error occurs:

```
  File "/home/xjw/Desktop/cyp/bxie/BevFormer/BEVFormer/tools/onnx_utils.py", line 74, in get_onnx_model
    torch.onnx.export(
  File "/root/miniconda3/lib/python3.8/site-packages/torch/onnx/__init__.py", line 350, in export
    return utils.export(
  File "/root/miniconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 163, in export
    _export(
  File "/root/miniconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 1074, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/root/miniconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 731, in _model_to_graph
    graph = _optimize_graph(
  File "/root/miniconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 249, in _optimize_graph
    _C._jit_pass_canonicalize_graph_fuser_ops(graph)
RuntimeError: 0 INTERNAL ASSERT FAILED at "../torch/csrc/jit/ir/alias_analysis.cpp":608, please report a bug to PyTorch. We don't have an op for aten::mul but it isn't a special case. Argument types: Tensor, bool,
```

Transforming BEVFormer to TensorRT also requires ONNX. Did you encounter this error, and how did you resolve it? Thanks!
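
For context, the "Argument types: Tensor, bool" in the message points at a tensor being multiplied by a plain Python bool somewhere in the forward pass; the JIT pass that canonicalizes the graph has no aten::mul overload for that pairing. A minimal, hypothetical sketch of the pattern and a workaround (the `Toy` modules below are illustrative, not BEVFormer code):

```python
import torch


class Toy(torch.nn.Module):
    def forward(self, x, flag: bool = True):
        # Multiplying a Tensor by a Python bool can leave an
        # aten::mul(Tensor, bool) node in the captured graph,
        # which trips the canonicalization pass in the traceback above.
        return x * flag


class ToyFixed(torch.nn.Module):
    def forward(self, x, flag: bool = True):
        # Workaround: cast the bool to a numeric scalar (or a tensor)
        # so the graph only ever sees aten::mul(Tensor, Scalar).
        return x * float(flag)


torch.onnx.export(ToyFixed(), (torch.randn(2, 3),), "toy_fixed.onnx")
```

Searching the model's forward for arithmetic between tensors and Python bools is usually the quickest way to locate the offending op; newer PyTorch releases also handle some of these cases more gracefully.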

DerryHub commented 1 year ago

Sorry, I haven't tried the BevFormer/BEVFormer/tools/onnx_utils.py code to transform BEVFormer to ONNX.

The original forward pass contains many ops that are not supported by ONNX or TensorRT, so I replaced all of the unsupported operators. BEVFormer_tensorrt supports the official PyTorch models.
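
As a generic illustration of that kind of rewrite (this example is not taken from the BEVFormer_tensorrt sources): an op with no clean ONNX mapping, such as `torch.nan_to_num` at older opsets, can be re-expressed in terms of ops that do export:

```python
import torch


def nan_to_num_onnx_friendly(x: torch.Tensor, value: float = 0.0) -> torch.Tensor:
    # torch.isnan and torch.where map to standard ONNX ops (IsNaN / Where),
    # so this exports cleanly where torch.nan_to_num may not.
    return torch.where(torch.isnan(x), torch.full_like(x, value), x)
```

Ops that cannot be decomposed this way at all are typically handled with custom TensorRT plugins instead, which is what this repo's float/half/half2/int8 plugins are for.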