PINTO0309 / PINTO_model_zoo

A repository for storing models that have been inter-converted between various frameworks. Supported frameworks are TensorFlow, PyTorch, ONNX, OpenVINO, TFJS, TFTRT, TensorFlowLite (Float32/16/INT8), EdgeTPU, CoreML.
https://qiita.com/PINTO
MIT License

About CREstereo model to TensorRT! #249

Closed · sunmooncode closed this 2 years ago

sunmooncode commented 2 years ago

Issue Type

Others

OS

Ubuntu

OS architecture

x86_64

Programming Language

C++

Framework

TensorRT

Model name and Weights/Checkpoints URL

CREstereo combined ONNX model

Description

When I use onnx-tensorrt to convert the combined model, I get the following error:

[2022-04-14 03:45:33   ERROR] [layers.cpp::EinsumLayer::5525] Error Code 3: API Usage Error (Parameter check failed at: optimizer/api/layers.cpp::EinsumLayer::5525, condition: nbInputs > 0 && nbInputs <= MAX_EINSUM_NB_INPUTS
)
While parsing node number 113 [Einsum -> "onnx::Mul_589"]:
ERROR: builtin_op_importers.cpp:1336 In function importEinsum:
[8] Assertion failed: layer_ptr && "Input layer is null."
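
For reference, the failing node can be inspected directly in the exported graph with the onnx Python package. This is only a minimal sketch; the filename crestereo_combined.onnx is a placeholder for the combined model:

```python
import onnx
from onnx import helper

# Placeholder filename; point this at the combined CREstereo export.
model = onnx.load("crestereo_combined.onnx")

# List every Einsum node, since the TensorRT parser rejects one of them
# (node number 113 in the error above).
for i, node in enumerate(model.graph.node):
    if node.op_type == "Einsum":
        equation = next(
            (helper.get_attribute_value(a) for a in node.attribute if a.name == "equation"),
            None,
        )
        print(f"node {i}: name={node.name} equation={equation} inputs={list(node.input)}")
```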

I saw here that you converted this model successfully. Could you give me some help?

Relevant Log Output

No response

URL or source code for simple inference testing code

No response

PINTO0309 commented 2 years ago
sunmooncode commented 2 years ago

I don't know much about TensorRT. Do you mean using the ONNX model directly for inference?

PINTO0309 commented 2 years ago

https://onnxruntime.ai/docs/execution-providers/TensorRT-ExecutionProvider.html
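
For example, a minimal Python sketch of inference through the TensorRT Execution Provider. The filename, input names, and shapes below are placeholders; query the session for the real ones of the combined CREstereo model:

```python
import numpy as np
import onnxruntime as ort

# Placeholder filename; point this at the combined CREstereo ONNX model.
sess = ort.InferenceSession(
    "crestereo_combined.onnx",
    providers=[
        "TensorrtExecutionProvider",  # tried first if TensorRT is available
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)

# Print the real input names/shapes/types of the model.
for inp in sess.get_inputs():
    print(inp.name, inp.shape, inp.type)

# Dummy stereo pair (NCHW float32, assumed resolution); replace with real left/right images.
left = np.random.rand(1, 3, 480, 640).astype(np.float32)
right = np.random.rand(1, 3, 480, 640).astype(np.float32)

outputs = sess.run(None, {sess.get_inputs()[0].name: left,
                          sess.get_inputs()[1].name: right})
print([o.shape for o in outputs])
```

One advantage of this route over onnx-tensorrt: if the TensorRT Execution Provider cannot handle a node (such as the Einsum above), ONNX Runtime assigns that part of the graph to the CUDA or CPU provider instead of aborting the conversion.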

https://github.com/PINTO0309/openvino2tensorflow#4-1-environment-construction-pattern-1-execution-by-docker-strongly-recommended

sunmooncode commented 2 years ago

@PINTO0309 Thanks bro! I solved it.

SolDogLi commented 1 year ago

> @PINTO0309 Thanks bro! I solved it.

I have the same problem. How did you solve it?