lix19937 / tensorrt-insight

Deep insight into TensorRT, including but not limited to QAT, PTQ, plugins, Triton inference, and CUDA

RuntimeError: input_shape_value == reshape_value || input_shape_value == 1 || reshape_value == 1 INTERNAL ASSERT FAILED at "../torch/ #47

Open lix19937 opened 1 month ago

lix19937 commented 1 month ago
/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/_internal/jit_utils.py:258: UserWarning: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function. (Triggered internally at ../torch/csrc/jit/passes/onnx/shape_type_inference.cpp:1884.)
  _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
Traceback (most recent call last):
  File "./tools/test.py", line 371, in <module>
    main()
  File "./tools/test.py", line 320, in main
    torch.onnx.export(
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/utils.py", line 504, in export
    _export(
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/utils.py", line 1529, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/utils.py", line 1115, in _model_to_graph
    graph = _optimize_graph(
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/utils.py", line 663, in _optimize_graph
    graph = _C._jit_pass_onnx(graph, operator_export_type)
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/utils.py", line 1899, in _run_symbolic_function
    return symbolic_fn(graph_context, *inputs, **attrs)
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/symbolic_helper.py", line 380, in wrapper
    return fn(g, *args, **kwargs)
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/symbolic_opset9.py", line 901, in expand_as
    return g.op("Expand", self, shape)
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/_internal/jit_utils.py", line 85, in op
    return _add_op(self, opname, *raw_args, outputs=outputs, **kwargs)
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/_internal/jit_utils.py", line 197, in _add_op
    node = _create_node(
  File "/home/lix/anaconda3/envs/torch113/lib/python3.8/site-packages/torch/onnx/_internal/jit_utils.py", line 258, in _create_node
    _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
RuntimeError: input_shape_value == reshape_value || input_shape_value == 1 || reshape_value == 1 INTERNAL ASSERT FAILED at "../torch/csrc/jit/passes/onnx/shape_type_inference.cpp":554, please report a bug to PyTorch. ONNX Expand input shape constraint not satisfied.

torch.onnx.export fails during the exporter's ONNX shape-inference pass: the expand_as call is lowered to an ONNX Expand node (symbolic_opset9.py, expand_as), and one of the input dimensions recorded during tracing neither matches the corresponding target dimension nor equals 1, so the Expand broadcast constraint is not satisfied.
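A minimal sketch of a commonly reported workaround, assuming the failure comes from an a.expand_as(b) call whose traced shapes do not satisfy the Expand broadcast rule (the module, tensor names, and shapes below are hypothetical, not taken from this repo): express the broadcast through an elementwise op, which the exporter lowers to ONNX Add with multidirectional broadcasting instead of a standalone Expand.

import torch

class Head(torch.nn.Module):
    # Hypothetical module standing in for the layer whose expand_as call
    # triggers the failing Expand node.
    def forward(self, a, b):
        # Original pattern that is lowered to ONNX Expand and can trip the
        # shape-inference assert when the traced shapes do not broadcast:
        #   return a.expand_as(b)
        # Workaround: broadcast via an elementwise add instead.
        return a + torch.zeros_like(b)

model = Head().eval()
# Dummy inputs for tracing; these placeholder shapes must be replaced with
# shapes the real model actually sees, otherwise the constants recorded
# during tracing can still conflict in shape inference.
a = torch.randn(1, 1, 8)
b = torch.randn(4, 6, 8)
torch.onnx.export(
    model, (a, b), "head.onnx",
    opset_version=13,
    input_names=["a", "b"],
    output_names=["out"],
)

If expand_as has to stay, another thing worth checking is that the dummy inputs passed to torch.onnx.export have the same relative shapes as the real inputs, since the assert compares constant shape values captured at trace time.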