The tool is designed to optimize models as much as possible for low-spec edge devices, so there are no plans to support variable resolution or variable batch sizes at this time. This may be implemented in the future.
Hi! Is it still impossible to convert a model with a variable batch size from OpenVINO to TF?
1. OpenVINO -> ONNX: https://github.com/PINTO0309/openvino2tensorflow
2. ONNX -> sbi4onnx -> N batch ONNX: https://github.com/PINTO0309/simple-onnx-processing-tools (a rough sketch of this step follows the list)
3. N batch ONNX -> TF
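For reference, a minimal sketch of what the batch-rewrite step amounts to, written with the plain `onnx` package instead of sbi4onnx (file names and the symbolic batch name `"N"` are placeholders, and the real tool handles more edge cases such as cleaning up intermediate shape info):

```python
# Relabel the first (batch) dimension of every graph input and output with a
# symbolic name so it stays dynamic instead of being fixed to 1.
import onnx

model = onnx.load("mobileone_s0_fixed_batch.onnx")        # placeholder input file
for value in list(model.graph.input) + list(model.graph.output):
    dim0 = value.type.tensor_type.shape.dim[0]
    dim0.dim_param = "N"                                   # dim_param and dim_value share a oneof,
                                                           # so this replaces the fixed batch size
onnx.save(model, "mobileone_s0_dynamic_batch.onnx")        # placeholder output file
```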
```
$ onnx-tf convert -i mobileone_s0_Nx3xHxW.onnx -o saved_model
/usr/local/lib/python3.8/dist-packages/tensorflow_addons/utils/ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.3.0 and strictly below 2.5.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.9.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
2022-07-29 20:04:51,012 - onnx-tf - INFO - Start converting onnx pb to tf saved model
WARNING:absl:Found untraced functions such as gen_tensor_dict while saving (showing 1 of 1). These functions will not be directly callable after loading.
2022-07-29 20:05:07,449 - onnx-tf - INFO - Converting completes successfully.
INFO:onnx-tf:Converting completes successfully.
```
```
$ saved_model_cli show --dir saved_model/ --all

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3, -1, -1)
        name: serving_default_input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['184'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1000)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict

Concrete Functions:
  Function Name: 'call'
    Named Argument #1
      input

  Function Name: 'gen_tensor_dict'
```
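If it helps, a small sanity check (not from the thread) that the converted SavedModel really accepts different batch sizes. The `saved_model` path, the 224x224 spatial size, and the batch sizes are assumptions; the signature name, input key `input`, and output key `184` come from the `saved_model_cli` dump above:

```python
import numpy as np
import tensorflow as tf

model = tf.saved_model.load("saved_model")
infer = model.signatures["serving_default"]

for batch in (1, 4):
    x = np.zeros((batch, 3, 224, 224), dtype=np.float32)  # NCHW; 224x224 is assumed
    y = infer(input=tf.constant(x))["184"]                 # keys taken from saved_model_cli
    print(batch, y.shape)                                  # expected: (batch, 1000)
```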
Hi. OpenVINO supports resizable models: you can change the input shape, and the output shape will change accordingly. Is it possible to convert such models?
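For context, this is the kind of reshaping meant here; a minimal sketch using the OpenVINO 2022.x Python API, where the model path, device, and shapes are placeholders and the exact calls may differ between OpenVINO versions:

```python
# Sketch of OpenVINO's reshape support: the same IR can be recompiled
# for a new input shape, and the output shape follows.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")          # placeholder path to the IR
model.reshape([4, 3, 320, 320])               # new N/H/W for a single-input model
compiled = core.compile_model(model, "CPU")

result = compiled.infer_new_request({0: np.zeros((4, 3, 320, 320), dtype=np.float32)})
print(next(iter(result.values())).shape)      # output batch follows the input batch
```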