tensorflow / tensorflow

An Open Source Machine Learning Framework for Everyone
https://tensorflow.org
Apache License 2.0

Error converting multilingual universal sentence encoder to TFLite. Input 1 of node StatefulPartitionedCall was passed float from statefulpartitionedcall_args_1:0 incompatible with expected resource. #42366

Closed Extremesarova closed 2 years ago

Extremesarova commented 4 years ago

System information

Command used to run the converter or code if you're using the Python API (if possible, please share a link to a Colab/Jupyter/any notebook):

import tensorflow as tf

# I've downloaded the model and unarchived it to save_path
converter = tf.lite.TFLiteConverter.from_saved_model(save_path)
tflite_model = converter.convert()
InvalidArgumentError                      Traceback (most recent call last)
~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
    496         results = c_api.TF_GraphImportGraphDefWithResults(
--> 497             graph._c_graph, serialized, options)  # pylint: disable=protected-access
    498         results = c_api_util.ScopedTFImportGraphDefResults(results)

InvalidArgumentError: Input 1 of node StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall was passed float from Func/StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/input/_1007:0 incompatible with expected resource.

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-10-55fd8585264a> in <module>
      1 #convert model to tensorflow lite
      2 converter = tf.lite.TFLiteConverter.from_saved_model(save_path)
----> 3 tflite_model = converter.convert()
      4 # open("converted_model.tflite", "wb").write(tflite_model)

~/.local/lib/python3.7/site-packages/tensorflow/lite/python/lite.py in convert(self)
   1074         Invalid quantization parameters.
   1075     """
-> 1076     return super(TFLiteConverterV2, self).convert()
   1077 
   1078 

~/.local/lib/python3.7/site-packages/tensorflow/lite/python/lite.py in convert(self)
    876     frozen_func, graph_def = (
    877         _convert_to_constants.convert_variables_to_constants_v2_as_graph(
--> 878             self._funcs[0], lower_control_flow=False))
    879 
    880     input_tensors = [

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/convert_to_constants.py in convert_variables_to_constants_v2_as_graph(func, lower_control_flow, aggressive_inlining)
   1107 
   1108   frozen_func = _construct_concrete_function(func, output_graph_def,
-> 1109                                              converted_input_indices)
   1110   return frozen_func, output_graph_def
   1111 

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/convert_to_constants.py in _construct_concrete_function(func, output_graph_def, converted_input_indices)
    999   new_func = wrap_function.function_from_graph_def(output_graph_def,
   1000                                                    new_input_names,
-> 1001                                                    new_output_names)
   1002 
   1003   # Manually propagate shape for input tensors where the shape is not correctly

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in function_from_graph_def(graph_def, inputs, outputs)
    648     importer.import_graph_def(graph_def, name="")
    649 
--> 650   wrapped_import = wrap_function(_imports_graph_def, [])
    651   import_graph = wrapped_import.graph
    652   return wrapped_import.prune(

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in wrap_function(fn, signature, name)
    626           signature=signature,
    627           add_control_dependencies=False,
--> 628           collections={}),
    629       variable_holder=holder,
    630       signature=signature)

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    984         _, original_func = tf_decorator.unwrap(python_func)
    985 
--> 986       func_outputs = python_func(*func_args, **func_kwargs)
    987 
    988       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in __call__(self, *args, **kwargs)
     85 
     86   def __call__(self, *args, **kwargs):
---> 87     return self.call_with_variable_creator_scope(self._fn)(*args, **kwargs)
     88 
     89   def call_with_variable_creator_scope(self, fn):

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in wrapped(*args, **kwargs)
     91     def wrapped(*args, **kwargs):
     92       with variable_scope.variable_creator_scope(self.variable_creator_scope):
---> 93         return fn(*args, **kwargs)
     94 
     95     return wrapped

~/.local/lib/python3.7/site-packages/tensorflow/python/eager/wrap_function.py in _imports_graph_def()
    646 
    647   def _imports_graph_def():
--> 648     importer.import_graph_def(graph_def, name="")
    649 
    650   wrapped_import = wrap_function(_imports_graph_def, [])

~/.local/lib/python3.7/site-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
    505                 'in a future version' if date is None else ('after %s' % date),
    506                 instructions)
--> 507       return func(*args, **kwargs)
    508 
    509     doc = _add_deprecated_arg_notice_to_docstring(

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in import_graph_def(***failed resolving arguments***)
    403       return_elements=return_elements,
    404       name=name,
--> 405       producer_op_list=producer_op_list)
    406 
    407 

~/.local/lib/python3.7/site-packages/tensorflow/python/framework/importer.py in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
    499       except errors.InvalidArgumentError as e:
    500         # Convert to ValueError for backwards compatibility.
--> 501         raise ValueError(str(e))
    502 
    503     # Create _DefinedFunctions for any imported functions.

ValueError: Input 1 of node StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall was passed float from Func/StatefulPartitionedCall/sequential/keras_layer/StatefulPartitionedCall/StatefulPartitionedCall/input/_1007:0 incompatible with expected resource.

https://tfhub.dev/google/universal-sentence-encoder-multilingual/3

I've also tried the large model and got the same error. Can someone help me?

karimnosseir commented 4 years ago

This model can be converted using the TF Select option, since it has some operations that are not supported by TF Lite. See https://www.tensorflow.org/lite/guide/ops_select. I've also added @abattery, who converted this model before and can guide you if needed.
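For reference, a minimal sketch of that conversion (assuming saved_model_dir points at the unpacked TF Hub SavedModel); SELECT_TF_OPS is usually combined with TFLITE_BUILTINS so that ops which do have TF Lite kernels still use them:

import tensorflow as tf

# saved_model_dir is assumed to be the directory of the unpacked TF Hub model.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # use built-in TF Lite kernels where available
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow (Flex) ops otherwise
]
tflite_model = converter.convert()
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)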

Extremesarova commented 4 years ago

@abattery Hi! Could you help me? This approach doesn't work either:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()

I'm getting the same error: "ValueError: Input 1 of node StatefulPartitionedCall was passed float from statefulpartitionedcall_args_1:0 incompatible with expected resource."

abattery commented 4 years ago

Sorry @Extremesarova

Actually, this model requires end-to-end (e2e) hash table support. We are working on delivering that feature, and I will update this thread when it has landed.

Extremesarova commented 4 years ago

Got it. Could you please suggest some pretrained multilingual embeddings from TF Hub that can be converted to TensorFlow Lite?

abattery commented 4 years ago

I have experimented with converting models from TF Hub, especially models that need hash table support. With a few exceptions, most models will be convertible to TFLite once the e2e hash table proposal lands, including https://tfhub.dev/google/universal-sentence-encoder-multilingual/3.

Extremesarova commented 3 years ago

@abattery Hi! Could you help me with a workaround until this feature is delivered? Could you provide the e2e hash table support, or something similar, so that I can convert the multilingual USE?

splch commented 3 years ago

I have experimented with converting models from TF Hub, especially models that need hash table support. With a few exceptions, most models will be convertible to TFLite once the e2e hash table proposal lands, including https://tfhub.dev/google/universal-sentence-encoder-multilingual/3.

I'm also curious whether there are any updates, or a timeline?

Extremesarova commented 3 years ago

Hi guys! Any updates on this problem?

jasonw247 commented 3 years ago

@abattery Is there any update on the e2e hash table? I'm also running into this issue. Thanks!

thecosta commented 3 years ago

Any update would be great!

abattery commented 3 years ago

There is better support in recent TF versions through tf.lite.TFLiteConverter.from_saved_model.

anovis commented 3 years ago

I was able to do the conversion using the latest TF version; however, I ran into problems using it on mobile.

This conversion worked for me:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite built-in ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable select TensorFlow (Flex) ops.
]
tflite_model = converter.convert()

But on mobile I get the error:

Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference

I think https://www.tensorflow.org/lite/guide/ops_select#android_aar may be related, but I wasn't able to get it to work so far.

See https://github.com/am15h/tflite_flutter_plugin/issues/101 for more details.
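As a side note, one way to check whether the converted model itself is fine is to load it with the TF Lite interpreter in Python, since the full TensorFlow pip package registers the Flex (select TF ops) delegate automatically; the file name below is just an assumption:

import tensorflow as tf

# Hypothetical path to the model produced by the conversion above.
interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
# allocate_tensors() raises if any op in the graph cannot be resolved,
# so getting past this line means the select TF ops were found.
interpreter.allocate_tensors()
print(interpreter.get_input_details())
print(interpreter.get_output_details())

If this succeeds but the model still fails on Android, the missing piece is usually the select TF ops dependency (org.tensorflow:tensorflow-lite-select-tf-ops) described in the ops_select guide linked above.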

sanatmpa1 commented 2 years ago

@Extremesarova,

We are checking to see if this is still an issue. Can you try updating TF to the latest stable version, i.e. 2.6.0, and let us know if the issue persists? Thanks!

Extremesarova commented 2 years ago

@sanatmpa1 I will definitely try it one more time and will be back with updated information. Last time I checked, I was able to convert to the TF Lite format and test the model's inference on the Python backend, but unfortunately inference on an Android device wasn't successful.

sanatmpa1 commented 2 years ago

@Extremesarova,

Thanks for the update. This guide can be a good reference for testing inference of the TF Lite model.

google-ml-butler[bot] commented 2 years ago

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

Extremesarova commented 2 years ago

@sanatmpa1, hi! I've tried everything, and inference on the Android backend doesn't work; the error concerns the lack of SentencepieceOp. But that is another issue, because this issue was about the error in conversion to TF Lite, and that seems to work now.

sanatmpa1 commented 2 years ago

@Extremesarova,

Thanks for the confirmation. Since the conversion seems to work now, can you close this issue and open a new one for the Android backend issue, if it still persists in the recent stable version?

google-ml-butler[bot] commented 2 years ago

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

google-ml-butler[bot] commented 2 years ago

Closing as stale. Please reopen if you'd like to work on this further.

google-ml-butler[bot] commented 2 years ago

Are you satisfied with the resolution of your issue?