alexander-sony opened 2 years ago
Could you use the deeplab v2 TF2 version? I am afraid the deeplab research team may not answer TF1 codebase questions.
Not really, since it has not been tested with TFLite. I want to run semantic segmentation on a DSP and need a quantized (~300x300) TFLite model.
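For reference, a frozen DeepLab graph can be converted to a weight-quantized TFLite model with the TF1 `tflite_convert` CLI. This is only a sketch: the tensor names `ImageTensor` and `SemanticPredictions` and the 1x300x300x3 input shape are assumptions based on DeepLab's export defaults, and the flag set may vary by TF 1.x version.

```shell
# Sketch (untested, assumes TF 1.15 is installed):
# convert a frozen DeepLab graph to a weight-quantized TFLite model.
# Input/output array names and input shape are assumptions; verify them
# against the actual graph (e.g. with a graph visualizer) before use.
tflite_convert \
  --graph_def_file=frozen_inference_graph.pb \
  --output_file=deeplab_quant.tflite \
  --input_arrays=ImageTensor \
  --output_arrays=SemanticPredictions \
  --input_shapes=1,300,300,3 \
  --post_training_quantize
```

Note that `--post_training_quantize` only quantizes the weights; full-integer quantization, which many DSPs require, needs a representative dataset and is done through the Python `tf.lite.TFLiteConverter` API instead.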
I still do not understand why deeplab/export_model.py fails when run in a Colab notebook. However, I have instead successfully used deeplab/local_test.py with TF 1.15 running from Docker. local_test.sh runs all the essential scripts: test, training, evaluation, and export.
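The Docker-based run described above can be sketched as follows. The image tag, mount path, and working directory are assumptions; the `PYTHONPATH` line follows the research-directory installation instructions in the tensorflow/models repo.

```shell
# Sketch (image tag and paths are assumptions): run the DeepLab
# end-to-end test inside a TF 1.15 container.
docker run -it --rm -v "$PWD/models:/models" tensorflow/tensorflow:1.15.5 bash

# Inside the container:
cd /models/research
# Make the research and slim packages importable, per the repo's install docs.
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
# Runs the model test, training, evaluation, visualization, and export steps.
sh deeplab/local_test.sh
```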
Reopened the issue as per above comments.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
1. The entire URL of the file you are using
https://github.com/tensorflow/models/tree/master/research/deeplab/export_model.py
2. Describe the bug
export_model.py fails to export one of the models in the model zoo: http://download.tensorflow.org/models/deeplabv3_mnv2_dm05_pascal_trainaug_2018_10_01.tar.gz
3. Steps to reproduce
See the Colab notebook: https://colab.research.google.com/drive/1eUukOqChFFLPZa7TwUOcXZguvJINTMqy?usp=sharing
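For readers without access to the notebook, the failing export can be reproduced roughly as below. The checkpoint directory name inside the archive and the flag values (notably `--depth_multiplier=0.5` for the dm05 variant and the 513x513 crop size) are assumptions drawn from the DeepLab export conventions; check the extracted files and adjust accordingly.

```shell
# Sketch (untested): download the dm05 Pascal checkpoint and re-run the
# export step manually from tensorflow/models/research.
curl -LO http://download.tensorflow.org/models/deeplabv3_mnv2_dm05_pascal_trainaug_2018_10_01.tar.gz
tar xzf deeplabv3_mnv2_dm05_pascal_trainaug_2018_10_01.tar.gz

# The checkpoint filename inside the archive may differ; list the
# extracted directory and point --checkpoint_path at the actual prefix.
python deeplab/export_model.py \
  --checkpoint_path=deeplabv3_mnv2_dm05_pascal_trainaug/model.ckpt \
  --export_path=./frozen_inference_graph.pb \
  --model_variant="mobilenet_v2" \
  --depth_multiplier=0.5 \
  --num_classes=21 \
  --crop_size=513 \
  --crop_size=513 \
  --inference_scales=1.0
```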
4. Expected behavior
export_model.py should produce frozen_inference_graph.pb.
5. Additional context
6. System information
Google Colab notebook, TF 1.15.2