tensorflow / models

Models and examples built with TensorFlow

Op type not registered 'TFLite_Detection_PostProcess' in binary for TF 1.13.1 #6753

Closed · sofeikov closed this 5 years ago

sofeikov commented 5 years ago


Describe the problem

I tried to use the object detection notebook example provided in the examples section and downloaded ssd_mobilenet_v2_quantized_coco. However, when I try to import the model and run the run_inference_for_single_image function, the whole thing fails with the following error:

tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'TFLite_Detection_PostProcess' in binary running on

I know there have been problems like this one before. However, in previous issues updating TF to 1.12 fixed it. My version is 1.13, so the problem should not be there. What are the potential reasons for this problem in 1.13?

Source code / logs

n/a

ksaurabh-cadence commented 5 years ago

I am also facing a similar issue. After freezing the ssd_mobilenet_v2_coco_2018_03_29 model with export_tflite_ssd_graph.py, I simply try to load the TensorFlow graph and encounter the error tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'TFLite_Detection_PostProcess' in binary.

My platform is Windows, and I tried with both TensorFlow 1.13 and 1.11, but I get the same error.

I am not sure whether this op type is registered with regular TensorFlow at all.
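
A minimal sketch of the failing load (the file path is an assumption; use whatever export_tflite_ssd_graph.py wrote for you):

```python
import tensorflow as tf  # written against TF 1.x

# Example path: the frozen graph produced by export_tflite_ssd_graph.py.
GRAPH_PB = "tflite_graph.pb"

graph_def = tf.GraphDef()
with tf.gfile.GFile(GRAPH_PB, "rb") as f:
    graph_def.ParseFromString(f.read())

# Raises tensorflow.python.framework.errors_impl.NotFoundError:
# Op type not registered 'TFLite_Detection_PostProcess' in binary,
# because the op is not registered in the regular TensorFlow runtime.
tf.import_graph_def(graph_def, name="")
```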

passion3394 commented 5 years ago

I have the same error when I try to run inference on tflite_graph.pb, which is produced by export_tflite_ssd_graph.py. I think it's a common problem; can the TensorFlow folks help?

passion3394 commented 5 years ago

@bmabey @alextp

sofeikov commented 5 years ago

Hi guys,

I've resolved this issue for myself. I might be using the wrong terminology, but here is what happens: the TFLite_Detection_PostProcess operation does not exist in the TensorFlow runtime; it is simply not registered there. Therefore, there is no simple way to get it working in TensorFlow.

However, this operation is registered in the TensorFlow Lite runtime. So the way forward is to convert your quantized pb file to the tflite format and run inference from there. As long as you load the tflite model, it will work like a charm.

Here is a description of how to do it: https://www.tensorflow.org/lite/convert/python_api#using_the_interpreter_from_a_model_file_
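
For concreteness, a rough conversion sketch for TF 1.13 (the tensor names are the defaults written by export_tflite_ssd_graph.py; the 300x300 input shape and the quantization stats are assumptions based on the SSD MobileNet config, so adjust them for your model):

```python
import tensorflow as tf  # written against TF 1.13

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="tflite_graph.pb",
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=[
        "TFLite_Detection_PostProcess",
        "TFLite_Detection_PostProcess:1",
        "TFLite_Detection_PostProcess:2",
        "TFLite_Detection_PostProcess:3",
    ],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]},
)

# The post-process op only exists in the TFLite runtime, so tell the
# converter to pass it through instead of rejecting it.
converter.allow_custom_ops = True

# Only for the quantized model; drop these two lines for a float graph.
converter.inference_type = tf.lite.constants.QUANTIZED_UINT8
converter.quantized_input_stats = {"normalized_input_image_tensor": (128.0, 128.0)}

with open("my.tflite", "wb") as f:
    f.write(converter.convert())
```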

I'm also attaching the zipped tflite model, so you can use it.

my.tflite.tar.gz

and then you would run inference on it just as described in the link above.
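
A minimal inference sketch (assuming the default uint8 input from the export; the output order follows the export script's docstring):

```python
import numpy as np
import tensorflow as tf

# Load the converted model into the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_path="my.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy uint8 image with the model's expected shape, e.g. [1, 300, 300, 3].
image = np.zeros(input_details[0]["shape"], dtype=np.uint8)
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

# TFLite_Detection_PostProcess emits four tensors:
# boxes, class ids, scores, and the number of detections.
boxes = interpreter.get_tensor(output_details[0]["index"])
classes = interpreter.get_tensor(output_details[1]["index"])
scores = interpreter.get_tensor(output_details[2]["index"])
num_detections = interpreter.get_tensor(output_details[3]["index"])
```

I'm closing this issue.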

ksaurabh-cadence commented 5 years ago

@sofeikov I am not trying to run inference on the TFLite model; I am trying to convert it to the ONNX format with TFLite_Detection_PostProcess as a custom op. Since the TF-to-ONNX converter uses TensorFlow, not TFLite, I would rather not convert to the tflite format.

passion3394 commented 5 years ago

@ksaurabh-cadence I think Google didn't open up the details of the TFLite_Detection_PostProcess op, so it's hard to implement this op in another framework.

maneshsyno commented 5 years ago


Hi @drsealks, can you give the exact code or command you used to convert the .pb file to my.tflite? Also, can you tell me which model you used, ssd_mobilenet_v1_coco or ssd_mobilenet_v2_coco?