Closed — luannd closed this issue 3 years ago
Here are the inference results from the tflite model converted by my tool. There doesn't seem to be any particular problem, so if the inference results are wrong after converting to TFJS, it may be a bug in tfjs-converter.
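To narrow down where the discrepancy enters, one way is to run the same input through both the tflite model and the converted TFJS model and compare the raw output tensors numerically. A minimal sketch of such a comparison helper (numpy only; the function name, tolerance, and the idea of exporting both outputs to arrays are assumptions, not part of either converter):

```python
import numpy as np

def outputs_match(a, b, atol=1e-3):
    """Return True if two model output tensors agree element-wise
    within an absolute tolerance. Shapes must match exactly."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    if a.shape != b.shape:
        return False
    return float(np.max(np.abs(a - b))) <= atol
```

If the tflite and TFJS outputs disagree on an identical input, the problem is in the conversion step rather than in the preprocessing.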
This is the Cup model that I converted: saved_model.zip
Closed due to lack of progress.
1. OS you are using: Mac OS
2. OS Architecture: Mac OS
3. Version of OpenVINO: same as your Docker
4. Version of TensorFlow: same as your Docker
5. Version of TensorRT: same as your Docker
6. Version of TFJS: 2.7.0
7. Version of coremltools: same as your Docker
8. Version of ONNX: same as your Docker
9. Download URL for .tflite IR model: directly from the Mediapipe repo: https://github.com/google/mediapipe/tree/master/mediapipe/models
10. URL of the repository from which the converted model was taken: converted using your Docker command
11. URL or source code for simple inference testing code
12. Issue Details
I can convert the tflite models to TFJS, but they seem to predict wrong results when tested with TFJS. The 9 keypoints appear more or less random and cannot be detected correctly. My preprocessing is the same as in https://github.com/sfsrd/objectron. I'm not sure what's wrong in the conversion pipeline.
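Since near-random keypoints are often caused by a preprocessing mismatch rather than by the converter, it may help to state the preprocessing explicitly and check it against the model's expected input. Below is a minimal numpy-only sketch of the kind of preprocessing such models typically need; the input size (224) and the [0, 1] value range are assumptions here and must be verified against the actual tflite model's input spec:

```python
import numpy as np

def preprocess(frame, size=224):
    """Hypothetical preprocessing sketch: nearest-neighbor resize of an
    HxWx3 uint8 RGB frame to (size, size), scale to [0, 1] floats, and
    add a batch dimension. The exact target size and normalization
    range must match what the tflite model was trained with."""
    h, w = frame.shape[:2]
    ys = np.arange(size) * h // size   # source row index per output row
    xs = np.arange(size) * w // size   # source col index per output col
    resized = frame[ys][:, xs]
    norm = resized.astype(np.float32) / 255.0
    return norm[np.newaxis, ...]       # shape: (1, size, size, 3)
```

Feeding the same preprocessed array to both the original tflite model and the converted TFJS model (rather than preprocessing separately on each side) makes it possible to tell whether the conversion or the input pipeline is at fault.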