So why not export the model with the post-processing step included, as shown in Models Export?
Please note that the way you are exporting the model is not wrong, but it is too "manual":
import torch
from super_gradients.training import models
model = models.get("yolo_nas_s", pretrained_weights="coco")  # model instantiation assumed; the original snippet omits it
model.eval()
model.prep_model_for_conversion(input_size=[1, 3, 320, 320])
dummy_input = torch.randn([1, 3, 320, 320], device="cpu")
torch.onnx.export(model, dummy_input, "yolo_nas_s.onnx", opset_version=11)
We offer model.export(), which does all of the above and much more, including optional pre- and post-processing, quantization, and other features. Please refer to Models Export.
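For reference, a minimal sketch of that path, assuming the pretrained COCO YOLO-NAS-S checkpoint (see the Models Export page for the full set of options):

from super_gradients.training import models
model = models.get("yolo_nas_s", pretrained_weights="coco")
export_result = model.export("yolo_nas_s.onnx")  # by default bakes pre-processing, box decoding and NMS into the ONNX graph
print(export_result)  # the result object describes the exported model's input/output layout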
If you want to post-process the predictions by hand, you can use PPYoloEPostPredictionCallback for this.
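Illustration only: a minimal sketch of doing that decoding by hand on the two raw ONNX outputs with plain NumPy/PyTorch, assuming the 4-wide output holds boxes already decoded to XYXY coordinates in the 320x320 input space and the 80-wide output holds per-class confidences (verify both against your super-gradients version). PPYoloEPostPredictionCallback is the supported route and also handles multi-label boxes and top-k limits.

import numpy as np
import onnxruntime as ort
import torch
from torchvision.ops import nms

session = ort.InferenceSession("yolo_nas_s.onnx")
image = np.random.rand(1, 3, 320, 320).astype(np.float32)  # stand-in for a preprocessed image
outputs = session.run(None, {session.get_inputs()[0].name: image})

# Identify the two raw outputs by shape: (1, 2100, 4) boxes and (1, 2100, 80) class scores.
pred_boxes, pred_scores = sorted(outputs, key=lambda o: o.shape[-1])
boxes = torch.from_numpy(pred_boxes[0])       # (2100, 4), assumed XYXY in input-image pixels
scores = torch.from_numpy(pred_scores[0])     # (2100, 80), assumed per-class confidences

conf, cls = scores.max(dim=1)                 # best class score and its index per box
keep = conf > 0.25                            # confidence threshold
boxes, conf, cls = boxes[keep], conf[keep], cls[keep]
keep = nms(boxes, conf, iou_threshold=0.7)    # class-agnostic NMS, kept simple for brevity
for box, score, label in zip(boxes[keep], conf[keep], cls[keep]):
    print(box.tolist(), float(score), int(label))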
💡 Your Question
I generated the ONNX model using the export code quoted in the answer above.
Then I ran inference with the ONNX model and obtained two outputs, one with shape (1, 2100, 80) and the other with shape (1, 2100, 4), which I assume are the scores and boxes respectively.
I saw in models_export_pose.md (https://github.com/Deci-AI/super-gradients/blob/master/documentation/source/models_export_pose.md) that there are four inference outputs, but I only get two. How can I obtain the confidence and other related information from those two outputs?
Versions
Versions of relevant libraries:
[pip3] numpy==1.23.0
[pip3] onnx==1.13.0
[pip3] onnx-simplifier==0.4.35
[pip3] onnxruntime==1.13.1
[pip3] onnxsim==0.4.35
[pip3] torch==1.13.1+cpu
[pip3] torchaudio==0.13.1+cpu
[pip3] torchmetrics==0.8.0
[pip3] torchvision==0.14.1+cpu
[conda] blas 1.0 mkl
[conda] mkl 2021.2.0 h06a4308_296
[conda] mkl-service 2.3.0 py38h27cfd23_1
[conda] mkl_fft 1.3.0 py38h42c9631_2
[conda] mkl_random 1.2.1 py38ha9443f7_2
[conda] numpy 1.20.1 py38h93e21f0_0
[conda] numpy-base 1.20.1 py38h7d8b39e_0
[conda] numpydoc 1.1.0 pyhd3eb1b0_1