Describe the issue
Hi everyone,
I have managed to run inference with a Mask R-CNN ONNX model on an image from the COCO dataset in C++. However, I am now unable to extract the bounding boxes, labels, scores, and masks from the output tensors. I checked the /onnxruntime-inference-examples/c_cxx/ repo, but I could not solve it. In Python this is straightforward, but I must do it in C++.
Code:
I am getting these output shapes: 13 - 4, 13 - 0, 13 - 0, 13 - 1.
But in Python I am getting 14 detected classes for the same image.
Can anyone give me a hand?
Thanks
To reproduce
Ubuntu 22.04, onnxruntime 1.14.1, CUDA 11.7, ONNX Mask R-CNN model
Urgency
No response
Platform
Linux
OS Version
22.04
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.14.1
ONNX Runtime API
C++
Architecture
X64
Execution Provider
CUDA
Execution Provider Library Version
CUDA 11.7