PINTO0309 / tflite2tensorflow

Generates saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob, and .pb from .tflite. Supports building environments with Docker, with direct access to the host PC GUI and camera to verify operation. NVIDIA GPU (dGPU) and Intel iHD GPU (iGPU) support. Supports dequantization of INT8 quantized models.
https://qiita.com/PINTO
MIT License

KeyError: 'operator_codes' #16

Closed mtyeager closed 2 years ago

mtyeager commented 2 years ago

Running tflite2tensorflow in a Docker container, against a tflite model generated by the AutoML Vision service in Google Cloud, produces:

sh-5.0$ tflite2tensorflow \
  --model_path model.tflite \
  --flatc_path ../flatc \
  --schema_path ../schema.fbs \
  --output_pb \
  --optimizing_for_openvino_and_myriad

Traceback (most recent call last):
  File "/usr/local/bin/tflite2tensorflow", line 6201, in <module>
    main()
  File "/usr/local/bin/tflite2tensorflow", line 5592, in main
    ops, json_tensor_details, op_types, full_json = parse_json(jsonfile_path)
  File "/usr/local/bin/tflite2tensorflow", line 265, in parse_json
    op_types = [v['builtin_code'] for v in j['operator_codes']]
KeyError: 'operator_codes'

The tflite output from AutoML Vision was three files: model.tflite, tflite_metadata.json, and dict.txt (which contains the list of labels).

I first received an error that model.json could not be found, so I renamed tflite_metadata.json to model.json, which produced the error above.

Here are the contents of model.json (tflite_metadata.json); it obviously does not include operator codes.

{ "inferenceType": "QUANTIZED_UINT8", "inputShape": [ 1, 320, 320, 3 ], "inputTensor": "normalized_input_image_tensor", "maxDetections": 40, "outputTensorRepresentation": [ "bounding_boxes", "class_labels", "class_confidences", "num_of_boxes" ], "outputTensors": [ "TFLite_Detection_PostProcess", "TFLite_Detection_PostProcess:1", "TFLite_Detection_PostProcess:2", "TFLite_Detection_PostProcess:3" ] }

Is tflite2tensorflow expecting a different set of files?

PINTO0309 commented 2 years ago

If it is difficult for you to make your model public, you can share the tflite file with me via DM. I have no idea what the problem is with the information you have provided.

PINTO0309 commented 2 years ago

There were three reasons why the tflite conversion failed.

  1. It appears that the Docker environment was not used, so not all of the required modules were installed.
  2. Due to the special nature of AutoML's Reshape operation, the shape that needs to be changed cannot be identified.
  3. The model cannot be converted to a blob because sort_result_descending of the NMS layer is true (see the sketch below).

All of the above issues have been resolved in v1.13.4. However, the AutoML model is so complex that the conversion to a Myriad blob took 2 hours and 30 minutes. The converted blobs have already been shared with you via DM. If everything is OK, please close the issue.
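For anyone hitting reason 3 on their own models: sort_result_descending is an attribute of the NonMaxSuppression layer in the OpenVINO IR XML. Below is a rough, hypothetical sketch of how that attribute could be inspected and flipped before compiling to a Myriad blob; the IR path is an assumption, and this is not necessarily the exact fix applied in v1.13.4. Changing the attribute can alter the ordering of the NMS output, so it is only safe if downstream post-processing does not rely on descending scores.

import xml.etree.ElementTree as ET

# Hypothetical path to the IR produced by the OpenVINO Model Optimizer.
IR_XML = "openvino/model.xml"

tree = ET.parse(IR_XML)
root = tree.getroot()

# Walk all layers and flip sort_result_descending on NonMaxSuppression nodes.
for layer in root.iter("layer"):
    if layer.get("type") == "NonMaxSuppression":
        data = layer.find("data")
        if data is not None and data.get("sort_result_descending") == "true":
            print(f"Patching {layer.get('name')}: sort_result_descending true -> false")
            data.set("sort_result_descending", "false")

tree.write(IR_XML, xml_declaration=True, encoding="utf-8")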

mtyeager commented 2 years ago

I am unsure about reason 1. I have pulled the latest Docker image, v1.13.4, but when I run the conversion I get the same error.

sh-5.0$ tflite2tensorflow \
  --model_path model.tflite \
  --flatc_path ../flatc \
  --schema_path ../schema.fbs \
  --output_pb \
  --optimizing_for_openvino_and_myriad

Traceback (most recent call last):
  File "/usr/local/bin/tflite2tensorflow", line 6220, in <module>
    main()
  File "/usr/local/bin/tflite2tensorflow", line 5598, in main
    ops, json_tensor_details, op_types, full_json = parse_json(jsonfile_path)
  File "/usr/local/bin/tflite2tensorflow", line 265, in parse_json
    op_types = [v['builtin_code'] for v in j['operator_codes']]
KeyError: 'operator_codes'

Thanks again.

mtyeager commented 2 years ago

I think the only issue was permissions. It ran under sudo with no problem.