Open prabal27 opened 2 years ago
Hi @prabal27,
Thanks for the issue. We have not verified converting the model to the TF-Lite framework, so we are not sure it is fully supported.
Cheers,
Hello @aquariusjay, while using the `tf.lite.TFLiteConverter.from_saved_model()` function to convert the model, it throws the following error:
###################################################################################################
```
ConverterError                            Traceback (most recent call last)
Input In [1], in <cell line: 10>()
      4 # converter.target_spec.supported_ops = [
      5 #     tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
      6 #     tf.lite.OpsSet.SELECT_TF_OPS     # enable TensorFlow ops.
      7 # ]
      8 converter.optimizations = [tf.lite.Optimize.DEFAULT]
---> 10 tflite_model = converter.convert()
     11 open("exported-model/TF-Lite/DeepLabv3Plus_model_new.tflite", "wb").write(tflite_model)

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/lite.py:929, in _export_metrics.

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/lite.py:908, in TFLiteConverterBase._convert_and_export_metrics(self, convert_func, *args, **kwargs)
    906 self._save_conversion_params_metric()
    907 start_time = time.process_time()
--> 908 result = convert_func(self, *args, **kwargs)
    909 elapsed_time_ms = (time.process_time() - start_time) * 1000
    910 if result:

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/lite.py:1212, in TFLiteSavedModelConverterV2.convert(self)
   1207 else:
   1208     self._debug_info = _get_debug_info(
   1209         _convert_debug_info_func(self._trackable_obj.graph_debug_info),
   1210         graph_def)
-> 1212 return self._convert_from_saved_model(graph_def)

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/lite.py:1095, in TFLiteConverterBaseV2._convert_from_saved_model(self, graph_def)
   1092 converter_kwargs.update(self._get_base_converter_args())
   1093 converter_kwargs.update(quant_mode.converter_flags())
-> 1095 result = _convert_saved_model(**converter_kwargs)
   1096 return self._optimize_tflite_model(
   1097     result, quant_mode, quant_io=self.experimental_new_quantizer)

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/convert_phase.py:212, in convert_phase.

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/convert_phase.py:205, in convert_phase.

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/convert.py:809, in convert_saved_model(**kwargs)
    807 model_flags = build_model_flags(**kwargs)
    808 conversion_flags = build_conversion_flags(**kwargs)
--> 809 data = convert(
    810     model_flags.SerializeToString(),
    811     conversion_flags.SerializeToString(),
    812     input_data_str=None,
    813     debug_info_str=None,
    814     enable_mlir_converter=True)
    815 return data

File ~/anaconda3/lib/python3.9/site-packages/tensorflow/lite/python/convert.py:311, in convert(model_flags_str, conversion_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
    309 for error_data in _metrics_wrapper.retrieve_collected_errors():
    310     converter_error.append_error(error_data)
--> 311 raise converter_error
    313 return _run_deprecated_conversion_binary(model_flags_str,
    314     conversion_flags_str, input_data_str,
    315     debug_info_str)

ConverterError:
```
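For anyone hitting a similar `ConverterError`: the commented-out `target_spec.supported_ops` lines visible in the traceback are the usual first workaround, since they let ops without TFLite kernels fall back to TensorFlow (Flex) ops. A minimal sketch of that route, using a tiny stand-in Keras model because I cannot verify it against the actual DeepLab export (the model and paths here are illustrative only):

```python
import tensorflow as tf

# Tiny stand-in model; in practice this would be the SavedModel
# directory produced by export_model.py.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])
tf.saved_model.save(model, "/tmp/demo_saved_model")

converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/demo_saved_model")
# Allow ops with no TFLite kernel to fall back to TensorFlow ops.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow (Flex) ops.
]
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("/tmp/demo_model.tflite", "wb") as f:
    f.write(tflite_model)
```

Note that a model converted with `SELECT_TF_OPS` requires the Flex delegate at inference time, which increases binary size on mobile.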
The old DeepLab repo seems to have a script, "convert_to_tflite.py", for generating TensorFlow Lite models. Should I work with that repo if I want to generate TFLite models, or are you planning to add such a script here?
Hi @prabal27,
Thanks for asking. We currently have no plans to add TF-Lite support, given our limited bandwidth.
Cheers,
Hello! I wanted to convert the generated model to the TF-Lite format. I first froze the model using the export_model.py script and then used the tflite_convert script. Is that the right approach, or do you recommend any special instructions for generating the TF-Lite model from the exported model?
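Whichever conversion route is used, a quick sanity check with the TFLite interpreter catches broken conversions early. A minimal sketch, again using an illustrative stand-in Keras model rather than the real DeepLab export; in practice you would pass the path to your written `.tflite` file via `model_path=` instead of `model_content=`:

```python
import numpy as np
import tensorflow as tf

# Stand-in model and in-memory conversion (illustrative only).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])
tf.saved_model.save(model, "/tmp/demo_saved_model2")
tflite_bytes = tf.lite.TFLiteConverter.from_saved_model(
    "/tmp/demo_saved_model2").convert()

# Load the flatbuffer and run a single inference on zeros.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

If this runs and the output shape matches the original model's, the conversion at least produced a loadable, executable model.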