ChuRuaNh0 / FastSam_Awsome_TensorRT


Ask for solution about shape_type_inference when converting .pt to .onnx file #14

Open LeungWaiHo opened 9 months ago

LeungWaiHo commented 9 months ago

Error Information: RuntimeError: THPVariable_Check(tuple_elem) INTERNAL ASSERT FAILED at "/opt/conda/conda-bld/pytorch_1614378073850/work/torch/csrc/jit/passes/onnx/shape_type_inference.cpp":676, please report a bug to PyTorch.
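For context (this is an assumption, not stated in the thread): this internal assert in `shape_type_inference.cpp` is commonly triggered when the traced model returns a nested tuple/list whose elements are not all plain tensors. A hedged workaround sketch is to export a thin wrapper module that flattens the outputs first; the helper below is pure Python, and the `ExportWrapper` name and export call in the comments are hypothetical:

```python
from typing import Any, List

def flatten_outputs(out: Any) -> List[Any]:
    """Recursively flatten nested tuples/lists into a flat list of leaves.

    torch.onnx.export expects every element of a returned tuple to be a
    Tensor; nested tuples (or None entries) can trip the internal assert
    reported above.
    """
    if isinstance(out, (tuple, list)):
        flat: List[Any] = []
        for item in out:
            flat.extend(flatten_outputs(item))
        return flat
    return [out]

# Hypothetical wrapper around the model before export (sketch only):
#
# class ExportWrapper(torch.nn.Module):
#     def __init__(self, model):
#         super().__init__()
#         self.model = model
#     def forward(self, x):
#         return tuple(flatten_outputs(self.model(x)))
#
# torch.onnx.export(ExportWrapper(model), dummy_input, "fastsam.onnx")
```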

ChuRuaNh0 commented 9 months ago

@LeungWaiHo You have to check your onnx version; it should be 1.8.0.
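The thread only states that onnx 1.8.0 is required. A minimal sketch of a version check (pure string comparison, assuming a plain `x.y.z` version string) could look like this; in practice `pip install onnx==1.8.0` pins the package and `import onnx; print(onnx.__version__)` confirms it:

```python
REQUIRED_ONNX = "1.8.0"  # version the maintainer reports working

def version_tuple(v: str):
    # "1.8.0" -> (1, 8, 0); assumes a plain x.y.z string, no rc suffix
    return tuple(int(p) for p in v.split(".")[:3])

def matches_required(installed: str, required: str = REQUIRED_ONNX) -> bool:
    # Compare the installed version (e.g. onnx.__version__) to the pin
    return version_tuple(installed) == version_tuple(required)
```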

LeungWaiHo commented 9 months ago

> @LeungWaiHo You have to check your onnx version; it should be 1.8.0.

Yes, my onnx version is 1.8.0 and my onnxruntime-gpu version is 1.9.0, but the above error still appears. Could you provide your onnx and TensorRT weights? Thanks!

ChuRuaNh0 commented 9 months ago

@LeungWaiHo I can provide the onnx file, but the TensorRT engine can only run in the venv where it was converted; it will not run in any other venv. onnx link: https://drive.google.com/file/d/1Ebs5FajByle0h_t5aegrgeq1PwPMVhf2/view?usp=sharing You need to convert the onnx model to a TensorRT engine yourself with the script I mentioned in this repo.
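The repo's exact conversion command isn't quoted in this thread. As a hedged sketch, onnx-to-TensorRT conversion is typically done with the `trtexec` tool that ships with TensorRT; the helper below only assembles such a command line (file names are hypothetical), which is consistent with engines being tied to the machine and TensorRT version they were built on:

```python
import shlex

def trtexec_cmd(onnx_path: str, engine_path: str, fp16: bool = True) -> str:
    """Build a trtexec invocation for converting an ONNX model to an engine.

    trtexec ships with TensorRT; the resulting .trt engine is only valid
    on the machine/TensorRT version that built it.
    """
    cmd = ["trtexec", f"--onnx={onnx_path}", f"--saveEngine={engine_path}"]
    if fp16:
        cmd.append("--fp16")  # optional reduced-precision build
    return shlex.join(cmd)
```

Running the returned string (e.g. `trtexec_cmd("fastsam.onnx", "fastsam.trt")`) in a shell on the target machine performs the actual build.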