tensorflow / models

Models and examples built with TensorFlow

Accuracy loss of converted tf-lite model #9143

Open ivs-pychen opened 4 years ago

ivs-pychen commented 4 years ago

I use export_tflite_ssd_graph.py to convert the model to a TFLite-compatible graph (.pb file). Then I use tflite_convert to convert the .pb file to the TFLite format. Is there any accuracy loss in the TFLite model? The following is the command:

```
tflite_convert \
  --graph_def_file=/home/ubuntu/ssd_mobilenet_v2/tflite_graph.pb \
  --output_file=/home/ubuntu/ssd_mobilenet_v2/model.tflite \
  --output_format=TFLITE \
  --input_arrays=normalized_input_image_tensor \
  --input_shapes=1,300,300,3 \
  --inference_type=FLOAT \
  --output_arrays="TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3" \
  --allow_custom_ops
```
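For reference, the same conversion can also be expressed with the Python API. This is only a sketch, assuming a TF 1.x environment (or `tf.compat.v1` in TF 2.x) and reusing the paths and tensor names from the command above:

```python
import tensorflow as tf  # TF 1.x; use tf.compat.v1.lite in TF 2.x

# Float conversion of the exported SSD graph; paths and tensor names
# are the same ones passed to tflite_convert above.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="/home/ubuntu/ssd_mobilenet_v2/tflite_graph.pb",
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=[
        "TFLite_Detection_PostProcess",
        "TFLite_Detection_PostProcess:1",
        "TFLite_Detection_PostProcess:2",
        "TFLite_Detection_PostProcess:3",
    ],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]},
)
# TFLite_Detection_PostProcess is a custom op, so custom ops must be allowed.
converter.allow_custom_ops = True

tflite_model = converter.convert()
with open("/home/ubuntu/ssd_mobilenet_v2/model.tflite", "wb") as f:
    f.write(tflite_model)
```

A pure float conversion like this generally preserves accuracy up to small numerical differences; larger drops usually come from quantization settings rather than the conversion itself.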

Can I use the TFLite-compatible graph (.pb file) to calculate the mAP, the same way I do with frozen_inference_graph.pb? Is the mAP of the TFLite-compatible graph (.pb file) the same as that of the TFLite model?
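One way to check is to run the converted .tflite model directly and feed its detections into the same mAP evaluation used for the frozen graph. A minimal sketch using the TFLite interpreter (image preprocessing and the mAP computation itself are omitted, and the dummy input below is a placeholder):

```python
import numpy as np
import tensorflow as tf

# Load the converted model and allocate tensors.
interpreter = tf.lite.Interpreter(
    model_path="/home/ubuntu/ssd_mobilenet_v2/model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy float input in [-1, 1]; replace with a real preprocessed image batch.
image = np.random.uniform(-1, 1, size=(1, 300, 300, 3)).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

# TFLite_Detection_PostProcess emits boxes, classes, scores and the
# number of detections (check output_details to confirm the ordering).
boxes = interpreter.get_tensor(output_details[0]["index"])
classes = interpreter.get_tensor(output_details[1]["index"])
scores = interpreter.get_tensor(output_details[2]["index"])
num_detections = interpreter.get_tensor(output_details[3]["index"])

# These arrays can then be passed to whichever mAP evaluation you already use
# for frozen_inference_graph.pb and the two results compared.
```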

DLMasterCat commented 4 years ago

I have a similar question. After I converted the model to TFLite with quantized UINT8 inference, the output was always the same regardless of the input, which leads to an accuracy drop.
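One common cause of constant outputs after a UINT8 conversion is missing or incorrect quantization parameters for the input tensor. Below is a sketch of a quantized conversion with the TF 1.x converter, assuming the graph was exported by export_tflite_ssd_graph.py from a quantization-aware (fake-quant) checkpoint; the (mean, std) values are the ones typically used for inputs normalized to [-1, 1] and are an assumption, not something confirmed in this thread:

```python
import tensorflow as tf  # TF 1.x; use tf.compat.v1.lite in TF 2.x

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="/home/ubuntu/ssd_mobilenet_v2/tflite_graph.pb",
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=[
        "TFLite_Detection_PostProcess",
        "TFLite_Detection_PostProcess:1",
        "TFLite_Detection_PostProcess:2",
        "TFLite_Detection_PostProcess:3",
    ],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]},
)
converter.allow_custom_ops = True

# Quantized inference: the input is UINT8, so the converter needs the
# (mean, std) used to map uint8 pixels into the float range the model expects.
# Wrong values here can collapse the activations and make every output
# look identical.
converter.inference_type = tf.uint8
converter.quantized_input_stats = {"normalized_input_image_tensor": (128, 128)}

tflite_model = converter.convert()
with open("/home/ubuntu/ssd_mobilenet_v2/model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

If the graph was not trained with quantization-aware training, a plain UINT8 conversion like this is not expected to work well in the first place, which would also explain the accuracy drop.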