Eddudos closed this issue 2 months ago
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 480, 480, 3)
        name: serving_default_input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_0'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 19)
        name: PartitionedCall:0
  Method name is: tensorflow/serving/predict
Thank you, it simply works now!
Issue Type
Others
OS
Linux
onnx2tf version number
1.25.9
onnx version number
1.16.1
onnxruntime version number
1.18.1
onnxsim (onnx_simplifier) version number
0.4.33
tensorflow version number
2.17.0
Download URL for ONNX
https://drive.google.com/file/d/1p1NK1Y5AZi2jpKJ8SumLZ-3WSSEGJb5u/view?usp=sharing
Parameter Replacement JSON
Description
I converted the ONNX model to TF with:
!onnx2tf -i weights/onnx/efficientnet_v2_m.onnx \
  -o weights/tf/efficientnet_v2_m \
  --non_verbose
My saved_model.pb:
I see a constant output batch size of 1 instead of -1. I also tried running inference with the TF model on a batch of images with shape (3, 480, 480, 3) and got an error, but it works correctly with a single image of shape (1, 480, 480, 3).
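The behaviour described above is what you would expect if the batch dimension of the serving signature was folded to a constant 1 during conversion, whereas the listing at the top of this thread (after the fix) shows a dynamic batch dimension of -1. A minimal sketch of that difference, in plain Python with no TensorFlow dependency; `shape_compatible` is a hypothetical helper written for illustration, not part of onnx2tf or TensorFlow:

```python
def shape_compatible(signature_shape, input_shape):
    """Return True if input_shape satisfies signature_shape,
    where -1 in the signature means 'any size' for that dimension."""
    if len(signature_shape) != len(input_shape):
        return False
    return all(s == -1 or s == i for s, i in zip(signature_shape, input_shape))

frozen = (1, 480, 480, 3)    # batch dim folded to a constant 1
dynamic = (-1, 480, 480, 3)  # dynamic batch dim, as saved_model_cli reports above

print(shape_compatible(frozen, (1, 480, 480, 3)))   # True: a single image fits
print(shape_compatible(frozen, (3, 480, 480, 3)))   # False: a batch of 3 is rejected
print(shape_compatible(dynamic, (3, 480, 480, 3)))  # True: any batch size fits
```

So a model whose signature reads (1, 480, 480, 3) will only ever accept single images, while (-1, 480, 480, 3) accepts batches of any size, which matches the behaviour reported here.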