nishanthballal-9 opened 2 years ago
Unable to create an onnxruntime inference session from an onnx exported DDRNet-23-slim model on GPU. Can you provide some support related to this?
Getting the following error: InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Invalid model. Node input '449' is not a graph input, initializer, or output of a previous node.
Did you find a solution for this? It seems that it has something to do with the inputs that were marked as optional in the source model structure.