Closed Sukeysun closed 4 months ago
docker run -it --rm \
-v `pwd`:/home/user/workdir \
ghcr.io/pinto0309/tflite2tensorflow:latest
tflite2tensorflow \
--model_path ./pose_detector.tflite \
--flatc_path ../flatc \
--schema_path ../schema.fbs \
--output_pb
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ op: RESHAPE
{'builtin_options_type': 'NONE',
'custom_options_format': 'FLEXBUFFERS',
'inputs': [439, 392],
'opcode_index': 8,
'outputs': [440]}
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ op: CONCATENATION
{'builtin_options': {'axis': 1, 'fused_activation_function': 'NONE'},
'builtin_options_type': 'ConcatenationOptions',
'custom_options_format': 'FLEXBUFFERS',
'inputs': [393, 416, 440],
'opcode_index': 9,
'outputs': [441]}
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
outputs:
{'dtype': <class 'numpy.float32'>,
'index': 441,
'name': 'Identity',
'quantization': (0.0, 0),
'quantization_parameters': {'quantized_dimension': 0,
'scales': array([], dtype=float32),
'zero_points': array([], dtype=int32)},
'shape': array([ 1, 2254, 12], dtype=int32),
'shape_signature': array([ 1, 2254, 12], dtype=int32),
'sparsity_parameters': {}}
{'dtype': <class 'numpy.float32'>,
'index': 429,
'name': 'Identity_1',
'quantization': (0.0, 0),
'quantization_parameters': {'quantized_dimension': 0,
'scales': array([], dtype=float32),
'zero_points': array([], dtype=int32)},
'shape': array([ 1, 2254, 1], dtype=int32),
'shape_signature': array([ 1, 2254, 1], dtype=int32),
'sparsity_parameters': {}}
TensorFlow/Keras model building process complete!
saved_model / .pb output started ====================================================
saved_model / .pb output complete!
saved_model_cli show --dir saved_model/ --all
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:
signature_def['serving_default']:
The given SavedModel SignatureDef contains the following input(s):
inputs['input_1'] tensor_info:
dtype: DT_FLOAT
shape: (1, 224, 224, 3)
name: input_1:0
The given SavedModel SignatureDef contains the following output(s):
outputs['Identity'] tensor_info:
dtype: DT_FLOAT
shape: (1, 2254, 12)
name: Identity:0
outputs['Identity_1'] tensor_info:
dtype: DT_FLOAT
shape: (1, 2254, 1)
name: Identity_1:0
Method name is: tensorflow/serving/predict
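The two outputs have 2254 rows each: one row per SSD anchor (12 box/keypoint values plus 1 score). As a sanity check, the anchor count can be reproduced from MediaPipe's anchor configuration for the 224x224 pose detector; this is a sketch assuming that config (strides 8/16/32 with the stride-32 layer repeated three times, 2 anchors per cell per layer), not output from the converter itself.

```python
# Sketch: where the 2254 in the output shapes (1, 2254, 12) / (1, 2254, 1)
# comes from. Assumes MediaPipe's SSD anchor config for the 224x224 pose
# detector: strides [8, 16, 32, 32, 32], 2 anchors per feature-map cell
# per layer (an assumption taken from the pose_detection graph config).
input_size = 224
strides = [8, 16, 32, 32, 32]
anchors_per_layer_cell = 2

num_anchors = 0
for stride in strides:
    feature_map = input_size // stride  # e.g. 224 // 8 = 28
    num_anchors += feature_map * feature_map * anchors_per_layer_cell

print(num_anchors)  # 2254
```

If the count matches, the converted graph kept all anchor rows through the final RESHAPE/CONCATENATION ops shown in the log above.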
It's hard to pull down the Docker container.
As an alternative, the model can be converted directly with tf2onnx:
python -m tf2onnx.convert \
--opset 11 \
--tflite pose_landmarks_detector.tflite \
--output pose_landmarks_detector.onnx \
--inputs-as-nchw input_1 \
--dequantize
onnxsim pose_landmarks_detector.onnx pose_landmarks_detector.onnx
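Note that because the export uses --inputs-as-nchw, the resulting ONNX model expects (1, 3, 224, 224) input, whereas the original TFLite model takes NHWC (1, 224, 224, 3). A minimal sketch of the required layout change when feeding the ONNX model (numpy only; array names are placeholders):

```python
import numpy as np

# The TFLite model takes NHWC input: (1, 224, 224, 3), as shown in the
# saved_model_cli output above.
nhwc = np.random.rand(1, 224, 224, 3).astype(np.float32)

# After conversion with --inputs-as-nchw, the ONNX model expects NCHW,
# so transpose the channel axis to position 1 before inference.
nchw = nhwc.transpose(0, 3, 1, 2)
print(nchw.shape)  # (1, 3, 224, 224)
```

Forgetting this transpose is a common cause of shape-mismatch errors when running the converted model in onnxruntime.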
Issue Type
Bug
OS
Ubuntu
OS architecture
x86_64
Programming Language
Python
Framework
ONNX
Download URL for tflite file
https://storage.googleapis.com/mediapipe-models/pose_landmarker/pose_landmarker_full/float16/latest/pose_landmarker_full.task

unzip pose_landmarker_full.task
Convert Script
Description
I want to convert the pose_detector.tflite file to an ONNX file. I followed the steps in the README: Step 1 : Generating saved_model and FreezeGraph (.pb)
Relevant Log Output