uelordi01 opened this issue 6 years ago
I have a similar problem!

2018-08-07 13:21:07.345473: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1053] Converting unsupported operation: SquaredDifference
2018-08-07 13:21:07.361241: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1053] Converting unsupported operation: TFLite_Detection_PostProcess
Same problem on Android. I tried the first variant, but got an error like "Cannot create interpreter: Didn't find custom op for name 'TFLite_Detection_PostProcess'". Then I noticed there were some notifications during the toco
conversion, like "Converting unsupported operation: TFLite_Detection_PostProcess", but the tflite file was created anyway. I then tried to generate a tflite file without any warnings, and I just can't. I also tried your second variant, but it causes the error "Output array not found: detection_boxes". Well, at least it doesn't generate an invalid file.
P.S. I used the ssd_mobilenet_v1_coco
pretrained model
If the tflite file was created, it is valid. Does the file run in the app? FYI, the "Converting unsupported operation" warning is there because the post-processing op is a custom TFLite op (not a regular TensorFlow op).
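A crude but handy sanity check (my own suggestion, not from the TFLite docs): op names are stored as plain strings inside the flatbuffer, so you can grep the generated file to see whether the custom post-processing op node was actually emitted. `detect.tflite` below is a placeholder for your converted model.

```shell
# Prints matching lines if the custom post-processing op is in the model file.
strings detect.tflite | grep TFLite_Detection_PostProcess
```

If this prints nothing, the converter never emitted the op, and the interpreter-side error is expected.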
@achowdhery the file does not run; it fails with a "Cannot create interpreter: Didn't find custom op for name 'TFLite_Detection_PostProcess'" exception
@kate-kate I think this is a version issue, where you don't have the version of TensorFlow in which the op is available. Please provide details on how you installed TensorFlow. Is this from within ML Kit?
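A quick way to confirm which build is actually on your path (the custom op only exists in sufficiently recent builds; trying a nightly is my suggestion, not a fix confirmed in this thread):

```shell
# Print the installed TF 1.x version and the git revision it was built from.
python -c "import tensorflow as tf; print(tf.VERSION, tf.GIT_VERSION)"
# If the build predates the op, a nightly build may include it:
pip install --upgrade tf-nightly
```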
@achowdhery I installed TensorFlow from source, just as described on the TensorFlow site
Here is the tf_env_collect.sh output; maybe it will help:
== cat /etc/issue ===============================================
Darwin MacBook-Pro-Ekaterina.local 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64
Mac OS X 10.13.6
== are we in docker =============================================
No
== compiler =====================================================
Apple LLVM version 9.1.0 (clang-902.0.39.2)
Target: x86_64-apple-darwin17.7.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
== uname -a =====================================================
Darwin MacBook-Pro-Ekaterina.local 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64
== check pips ===================================================
numpy 1.14.5
numpydoc 0.7.0
protobuf 3.6.0
tensorflow 1.10.0
== check for virtualenv =========================================
False
== tensorflow import ============================================
tf.VERSION = 1.10.0
tf.GIT_VERSION = v1.10.0-rc1-19-g656e7a2b34
tf.COMPILER_VERSION = v1.10.0-rc1-19-g656e7a2b34
Sanity check: array([1], dtype=int32)
== env ==========================================================
LD_LIBRARY_PATH is unset
DYLD_LIBRARY_PATH is unset
== nvidia-smi ===================================================
tf_env_collect.sh: line 105: nvidia-smi: command not found
== cuda libs ===================================================
I am having a similar issue here, except that my tflite model works in the Android demo; the only problem is that it returns many false positive detections.
I am not sure whether this toco log line suggests the problem: tensorflow/lite/toco/import_tensorflow.cc:1332] Converting unsupported operation: TFLite_Detection_PostProcess
@achowdhery any update regarding this issue? I am facing the same problem when converting a custom-trained model to tflite.
I am facing the same problem. Did anyone fix it?
I am having the same issues. Any updates?
System information
What is the top-level directory of the model you are using:
Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
TensorFlow installed from binary (CPU):
TensorFlow version (use command below):
Describe the problem
Hi,
I am trying to export the ssdlite_mobilenet_v2 model to TF-Lite. I downloaded it from this path, and I tried these two ways:
1. Following this tutorial with export_tflite_ssd_graph.py, using the checkpoint.
I used export_tflite_ssd_graph.py like this:
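(The exact command was lost from this post; a typical invocation per the Object Detection API's mobile guide, with placeholder paths, would be:)

```shell
# Export a TFLite-compatible frozen graph from an Object Detection API
# checkpoint; $CONFIG_FILE, $CHECKPOINT_PATH, and $OUTPUT_DIR are placeholders.
python object_detection/export_tflite_ssd_graph.py \
  --pipeline_config_path=$CONFIG_FILE \
  --trained_checkpoint_prefix=$CHECKPOINT_PATH \
  --output_directory=$OUTPUT_DIR \
  --add_postprocessing_op=true
```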
and I got these errors:
So I added --allow_custom_ops, and when I used the TF-Lite interpreter on Android:
As I expected, I got a custom-operation error.
2. Using toco with the pretrained model.
My toco script is:
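(The script itself was lost from this post; for reference, a float conversion along the lines of the Object Detection API's documented command, with placeholder paths, would be:)

```shell
# Convert the exported graph to a float .tflite file. The four output arrays
# are the tensors produced by the custom TFLite_Detection_PostProcess op,
# which is why --allow_custom_ops is required.
bazel run --config=opt tensorflow/contrib/lite/toco:toco -- \
  --input_file=$OUTPUT_DIR/tflite_graph.pb \
  --output_file=$OUTPUT_DIR/detect.tflite \
  --input_shapes=1,300,300,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=FLOAT \
  --allow_custom_ops
```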
As I had some errors with the operators, I added --allow_custom_ops, and when I used the TF-Lite interpreter on Android:
But, as I expected, I got a custom-operation error.
So my questions are: