PINTO0309 / tflite2tensorflow

Generate saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob, and .pb from .tflite. Supports building environments with Docker, with direct access to the host PC's GUI and camera to verify operation. NVIDIA GPU (dGPU) and Intel iHD GPU (iGPU) support. Supports dequantization of INT8-quantized models.
https://qiita.com/PINTO
MIT License

Use docker but no *.json error #37

Closed: imjking closed this issue 11 months ago

imjking commented 11 months ago

Issue Type

Bug

OS

Ubuntu

OS architecture

x86_64

Programming Language

C++, Python

Framework

ONNX, TensorFlow, TensorFlowLite

Download URL for tflite file

https://github.com/google/mediapipe/blob/master/mediapipe/modules/hand_landmark/hand_landmark_full.tflite

Convert Script

tflite2tensorflow --model_path hand_landmark_full.tflite --flatc_path ../flatc --schema_path ../schema.fbs --output_pb

Description

Hi, I saw issue https://github.com/PINTO0309/tflite2tensorflow/issues/33 and I am using Docker. I want to convert a MediaPipe .tflite model to .pb and quantize it, but I get `FileNotFoundError: [Errno 2] No such file or directory: './hand_landmark_full.json'`. What is the problem?
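The traceback below shows the failure actually happens one step earlier than the JSON parse: flatc exits with an error, so hand_landmark_full.json is never written, and the converter then fails trying to open it. A minimal sketch of pre-flight checks before running the converter (paths taken from the issue; adjust to your layout):

```shell
# Pre-flight checks before running tflite2tensorflow (paths from the issue).
# If flatc cannot read the model, it never writes <model>.json, and the
# converter later fails with FileNotFoundError on that file.
MODEL=hand_landmark_full
[ -f "${MODEL}.tflite" ] || echo "missing ${MODEL}.tflite in $(pwd)"
[ -s "${MODEL}.tflite" ] || echo "${MODEL}.tflite is empty or unreadable"
[ -x ../flatc ] || echo "../flatc is not an executable file"
[ -f ../schema.fbs ] || echo "../schema.fbs not found"
```

An empty or truncated .tflite (for example, a failed download that saved an HTML error page) produces exactly the "Unable to generate text" error from flatc seen in the log.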

Relevant Log Output

user@90fbb8a901d3:~/workdir$ tflite2tensorflow --model_path hand_landmark_full.tflite --flatc_path ../flatc --schema_path ../schema.fbs --output_pb
output json command = ../flatc -t --strict-json --defaults-json -o . ../schema.fbs -- hand_landmark_full.tflite
../flatc: error: Unable to generate text for hand_landmark_full
Usage: ../flatc [OPTION]... FILE... [-- FILE...]

FILEs may be schemas (must end in .fbs), binary schemas (must end in .bfbs),
or JSON files (conforming to preceding schema). FILEs after the -- must be
binary flatbuffer format files.
Output files are named using the base file name of the input,
and written to the current directory or the path given by -o.
example: ../flatc -c -b schema1.fbs schema2.fbs data.json
Traceback (most recent call last):
  File "/usr/local/bin/tflite2tensorflow", line 6614, in <module>
    main()
  File "/usr/local/bin/tflite2tensorflow", line 5861, in main
    ops, json_tensor_details, op_types, full_json = parse_json(jsonfile_path)
  File "/usr/local/bin/tflite2tensorflow", line 247, in parse_json
    j = json.load(open(jsonfile_path))
FileNotFoundError: [Errno 2] No such file or directory: './hand_landmark_full.json'

Source code for simple inference testing code

No response

PINTO0309 commented 11 months ago


wget https://storage.googleapis.com/mediapipe-assets/hand_landmark_full.tflite

tflite2tensorflow \
--model_path hand_landmark_full.tflite \
--flatc_path ../flatc \
--schema_path ../schema.fbs \
--output_pb
imjking commented 11 months ago

Thanks for your reply. I tried your suggestion, but I still get the same error. By the way, the URL I pasted no longer contains the .tflite model; I did download the model from there previously, so the official MediaPipe repository may have removed it.

PINTO0309 commented 11 months ago

I can only assume that the command you entered is wrong, so there is nothing more specific I can advise. If you are using Docker, it cannot be an environment issue. My guess is that the .tflite file is not present.

wget https://storage.googleapis.com/mediapipe-assets/hand_landmark_full.tflite

tflite2tensorflow \
--model_path hand_landmark_full.tflite \
--flatc_path ../flatc \
--schema_path ../schema.fbs \
--output_pb

tflite2tensorflow \
--model_path hand_landmark_full.tflite \
--flatc_path ../flatc \
--schema_path ../schema.fbs \
--output_no_quant_float32_tflite


imjking commented 11 months ago

OK, thank you. Could it be due to the FlatBuffers version? In the Docker image it is 1.12.0, but the README says 2.0.8.
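For reference, the flatc version actually in use can be checked directly (the `../flatc` path is the one from the issue; the PATH fallback is an assumption about the environment):

```shell
# Print the version of the flatc binary the converter would use.
# Falls back to any flatc on PATH if ../flatc is absent.
if [ -x ../flatc ]; then
  ../flatc --version
elif command -v flatc >/dev/null 2>&1; then
  flatc --version
else
  echo "no flatc found"
fi
```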

PINTO0309 commented 11 months ago

You don't need to worry about that in particular; just execute the commands I suggested above after starting Docker.

imjking commented 11 months ago

OK, it works now. I needed to run the command with sudo. Thank you.
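The sudo fix suggests a file-permission problem rather than a conversion bug: files created as root by an earlier sudo run, or a root-owned bind mount, can make the working directory unwritable for the container user. A sketch of the check and a cleaner fix than running everything as root (the chown target is an assumption; verify ownership first with `ls -ld`):

```shell
# Check whether the current directory is writable; if not, reclaiming
# ownership is cleaner than prefixing every command with sudo.
if touch .write_test 2>/dev/null; then
  rm -f .write_test
  echo "workdir is writable"
else
  echo "workdir is NOT writable; consider:"
  echo "  sudo chown -R \$(id -u):\$(id -g) ."
fi
```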