This model can be converted using the TF_SELECT option, since it contains some operations that are not supported by TF Lite. See https://www.tensorflow.org/lite/guide/ops_select. Also adding @abattery, who converted this model before and can guide you if needed.
@abattery Hi! Could you help me? This approach doesn't work either:
import tensorflow as tf

# saved_model_dir points at the downloaded SavedModel directory.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()
I'm getting the same error: "ValueError: Input 1 of node StatefulPartitionedCall was passed float from statefulpartitionedcall_args_1:0 incompatible with expected resource."
Sorry @Extremesarova
Actually, this model requires e2e hash table support. We are working on delivering that feature, and I will update this thread once it has landed.
Got it. Could you please suggest some pretrained multilingual embeddings from TF Hub that can be converted to TensorFlow Lite?
I have experimented with converting the models on TF Hub, especially the ones that need hash table support. With a few exceptions, most of them will be convertible to TFLite once the e2e hash table proposal lands, including https://tfhub.dev/google/universal-sentence-encoder-multilingual/3.
@abattery Hi! Could you help me with a workaround until this feature is delivered? Could you provide the e2e hash table support, or something similar, so that I can convert the multilingual USE?
I'm also curious if there are any updates, or a timeline?
Hi, guys! Any updates on the problem?
@abattery Is there any update on the e2e hash table? I'm also running into this issue. Thanks!
Any update would be great!
There is better support for this in recent TF versions through tf.lite.TFLiteConverter.from_saved_model.
I was able to do the conversion using the latest TF version; however, I ran into problems using it on mobile.
This conversion worked for me:
import tensorflow as tf

# saved_model_dir points at the downloaded SavedModel directory.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite builtin ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable select TensorFlow ops.
]
tflite_model = converter.convert()
But on mobile I get the error:
Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference
I think https://www.tensorflow.org/lite/guide/ops_select#android_aar may be related, but I wasn't able to get it to work so far.
See https://github.com/am15h/tflite_flutter_plugin/issues/101 for more details.
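For anyone hitting the same error: per the ops_select guide, the fix comes down to linking the Select TF ops AAR into the Android app so the Flex delegate is available at runtime. A minimal sketch of the Gradle dependencies (the version tags are illustrative; check the guide for the current ones):

dependencies {
    // Core TFLite runtime.
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    // Flex delegate providing the select TensorFlow ops at runtime.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}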
@Extremesarova,
We are checking to see if this is still an issue. Can you try updating TF to the latest stable version, i.e. 2.6.0, and let us know if the issue persists? Thanks!
@sanatmpa1 I will definitely try it one more time and will be back with updated information. Last time I checked, I was able to convert the model to the TFLite format and test inference of the TFLite model on the Python backend, but unfortunately inference on an Android device wasn't successful.
@Extremesarova,
Thanks for the update. This guide can be a good reference for you to test inference of the TFLite model.
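In case it helps others, here is a minimal sketch of such a Python-side inference test, assuming the converted model was written to use_multilingual.tflite (a placeholder name) and exposes a single string input:

import numpy as np
import tensorflow as tf

# Load the converted model; the TF pip package's interpreter already
# links the Flex delegate needed for the select TF ops.
interpreter = tf.lite.Interpreter(model_path="use_multilingual.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one sentence; string tensors are passed as numpy byte strings.
interpreter.set_tensor(input_details[0]["index"], np.array([b"Hello, world!"]))
interpreter.invoke()

# For USE-multilingual the output should be a (1, 512) sentence embedding.
embedding = interpreter.get_tensor(output_details[0]["index"])
print(embedding.shape)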
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
@sanatmpa1, hi! I've tried everything, and inference on the Android backend doesn't work - the error concerns the lack of the SentencepieceOp. But that is a separate issue; this one was about an error in the conversion to TFLite, and that seems to work now.
@Extremesarova,
Thanks for the confirmation. Since the conversion seems to work now, can you close this issue and open a new one for the Android backend problem, if it still persists in the recent stable version?
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
System information
Command used to run the converter or code if you're using the Python API (if possible, please share a link to a Colab/Jupyter/any notebook):
https://tfhub.dev/google/universal-sentence-encoder-multilingual/3
I've also tried the large model and got the same error. Can someone help me?
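For reference, a self-contained conversion script along the lines discussed above - a sketch, assuming tensorflow_hub and tensorflow_text are installed (hub.resolve returns the local cache path of the downloaded SavedModel, and importing tensorflow_text registers the SentencepieceOp the model depends on):

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 - registers the SentencepieceOp.

# Resolve the hub handle to the locally cached SavedModel directory.
saved_model_dir = hub.resolve(
    "https://tfhub.dev/google/universal-sentence-encoder-multilingual/3")

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # TensorFlow Lite builtin ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # select TensorFlow ops via Flex.
]
tflite_model = converter.convert()

with open("use_multilingual.tflite", "wb") as f:
    f.write(tflite_model)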