I use export_tflite_ssd_graph.py to convert the model to a TFLite-compatible graph (.pb file). Then I use tflite_convert to convert the .pb file to TFLite format. Is there any accuracy loss in the TFLite model?
The following is the command:
tflite_convert \
--graph_def_file=/home/ubuntu/ssd_mobilenet_v2/tflite_graph.pb \
--output_file=/home/ubuntu/ssd_mobilenet_v2/model.tflite \
--output_format=TFLITE \
--input_arrays=normalized_input_image_tensor \
--input_shapes=1,300,300,3 \
--inference_type=FLOAT \
--output_arrays="TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3" \
--allow_custom_ops
Can I use the TFLite-compatible graph (.pb file) to calculate mAP, the same way as with frozen_inference_graph.pb?
Is the mAP result of the TFLite-compatible graph (.pb file) the same as that of the .tflite model?