khanhlvg / TFLiteDemo

Sample app to showcase several TensorFlow Lite models
Apache License 2.0

Segmentation with Deeplabv3 Cityscape Model #2

Open · titanbender opened this issue 4 years ago

titanbender commented 4 years ago

Dear Khanh,

Thanks for sharing your sample code! The Android and Colab code are very useful!

My question is about configuring the app for other segmentation models.

More specifically, I'd like to test the DeepLabv3 model on Cityscapes data, and I've identified the following model in the TF Model Zoo: mobilenetv3_small_cityscapes_trainfine.

To convert the model to TF Lite, I used the following command:

tflite_convert \
  --output_file=./deeplabv3_city.tflite \
  --graph_def_file=./frozen_inference_graph.pb \
  --input_arrays=ImageTensor \
  --output_arrays=ExpandDims_1 \
  --input_shapes=1,257,257,3 \
  --inference_input_type=QUANTIZED_UINT8 \
  --inference_type=FLOAT \
  --mean_values=128 \
  --std_dev_values=127

My question is two-fold:

  1. When testing the converted model with your Colab, I get a 'set tensor' error with the message Got tensor of type 0 but expected type 3 for input 3, regardless of the input data type I try (uint8, float32, int32, etc.). Do you have an intuition about where the error could lie? (See the diagnostic sketch after this list.)


  2. My goal is to run the model on Android with your segmentation sample code. Do you have any advice on how the sample code needs to be modified to run a different model like this one?
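
For reference, here is a minimal sketch of how the expected input type can be queried (this assumes the standard tf.lite.Interpreter Python API and the model file produced by the command above):

import tensorflow as tf

# Load the converted model and print what the interpreter expects for each input.
interpreter = tf.lite.Interpreter(model_path="deeplabv3_city.tflite")
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    # "dtype" is the numpy type the input tensor must have; "shape" is its shape.
    print(detail["name"], detail["dtype"], detail["shape"])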

Thank you for your time.

Sincerely, Johan

khanhlvg commented 4 years ago

I looked at the model you want to use, but I ran into some non-trivial errors running it. DeepLab v3 requires TF 1, while TF Lite has stopped development on TF 1 entirely and moved to TF 2, so a few hacks are needed to convert it properly.

  1. Use TF 1.15 to export the pretrained model in SavedModel format. Change this export script to export to SavedModel format instead of frozen graph format; you can see an example here. Make sure to use a fixed-size input instead of a dynamic (i.e. shape = None) input.
  2. Use the TF 2 TFLiteConverter to convert the SavedModel to TF Lite.
  3. Use the TF 2 TF Lite Interpreter Python API to test that it works (a rough sketch of all three steps follows this list).
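
Here is a rough sketch of the whole pipeline, as two separate scripts (step 1 needs TF 1.15; steps 2 and 3 need TF 2). The file paths and the tensor names ImageTensor and ExpandDims_1 are assumptions based on your conversion command, so adjust them to match your exported graph:

# Step 1, run under TF 1.15: wrap the frozen graph in a SavedModel.
# Assumes the graph was exported with a fixed-size input (not shape=None).
import tensorflow as tf  # tensorflow==1.15

graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

with tf.compat.v1.Session() as sess:
    tf.import_graph_def(graph_def, name="")
    graph = sess.graph
    # Tensor names taken from the --input_arrays/--output_arrays flags above.
    inputs = {"ImageTensor": graph.get_tensor_by_name("ImageTensor:0")}
    outputs = {"ExpandDims_1": graph.get_tensor_by_name("ExpandDims_1:0")}
    tf.compat.v1.saved_model.simple_save(sess, "saved_model_dir", inputs, outputs)

# Steps 2 and 3, run under TF 2: convert the SavedModel and smoke-test it.
import numpy as np
import tensorflow as tf  # tensorflow>=2.0

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()
with open("deeplabv3_city.tflite", "wb") as f:
    f.write(tflite_model)

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
# Feed a dummy image with exactly the dtype and shape the model expects.
dummy = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()
out = interpreter.get_output_details()[0]
print(interpreter.get_tensor(out["index"]).shape)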

Please try these steps, and if you see errors, share the reproducible steps as a Colab notebook and I'll see if I can help you troubleshoot.