Open rafiuddinkhan opened 4 years ago
I trained the model with Keras, then converted it to TFLite and to other frameworks' formats. I also trained the model with the TensorFlow Object Detection API and converted it to TFLite; however, when I quantized the model, the results were wrong. You can use the TFLite model I provided in the models directory to reproduce the quantization.
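For reference, this is roughly the full-integer (uint8) post-training quantization path I would expect. This is only a minimal sketch using a toy Keras model as a stand-in; the actual SSD model, input shape, and calibration data from this repo would need to be substituted in:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in model -- NOT the SSD model from this repo.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(2),
])

def representative_dataset():
    # Calibration samples should match the real training input
    # distribution; random data here is only a placeholder.
    for _ in range(100):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization of ops, with uint8 inputs/outputs.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_quant_model = converter.convert()
print(len(tflite_quant_model))
```

A bad representative dataset (e.g. inputs not preprocessed the same way as at training time) is a common cause of a quantized model producing garbage detections while the float model is fine.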
Thanks. If you trained the model with the TensorFlow Object Detection API, you may have exported it using `export_tflite_ssd_graph.py`. Can you share the resulting `tflite_graph.pb`, or, if you only trained the model, the trained checkpoints?
I also converted the pb offered by the project to TFLite, but the uint8 TFLite model produces wrong results. When I tried to adapt it, the maximum confidence came out at about 30, and the anchors' shape seemed unusual. (Just venting: writing in English is really hard.)
Please add the model checkpoint used for conversion to TFLite, as it can be used to optimize post-processing for both the FLOAT and INT quantized models.