Closed: Sklaebe closed this issue 8 months ago.
This seems more like a core TensorFlow question, and probably one the TF team would redirect to Stack Overflow.
Tensorflow-text doesn't currently expose a static library containing all the ops, but you could probably create one based on the :ops_lib target. Otherwise, you need to load all the op libraries individually; for example, the pip package contains the shared library python/ops/_fast_bert_normalizer.so, which should register the op when loaded.
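In the C API, loading an op library this way amounts to a TF_LoadLibrary call before the SavedModel is loaded, which adds the ops to the process-wide registry. A minimal sketch, assuming the library path as shipped in the pip package (which libraries a given model needs depends on which tensorflow-text ops it uses):

```c
#include <stdio.h>
#include <tensorflow/c/c_api.h>

int main(void) {
  TF_Status* status = TF_NewStatus();

  /* Registers the ops/kernels in the shared object with the global registry,
   * analogous to what `import tensorflow_text` does in Python. */
  TF_Library* lib = TF_LoadLibrary("python/ops/_fast_bert_normalizer.so", status);
  if (TF_GetCode(status) != TF_OK) {
    fprintf(stderr, "Failed to load op library: %s\n", TF_Message(status));
    TF_DeleteStatus(status);
    return 1;
  }

  /* ... TF_LoadSessionFromSavedModel can now resolve the custom ops ... */

  TF_DeleteLibraryHandle(lib);  /* frees the handle; does not unload the library */
  TF_DeleteStatus(status);
  return 0;
}
```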
I have an application where I want to use a saved TensorFlow model for serving. I tried to serve pre-trained Hugging Face models for this purpose, for example a BERT model. In order to combine tokenizer and model into a single custom model, I built and saved my model as follows:
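The original snippet is not reproduced here; a minimal sketch of such a combined model, assuming tensorflow_text's BertTokenizer for the tokenization side and transformers' TFBertModel as the encoder (the class name, vocabulary file, and export path are illustrative), could look like this:

```python
import tensorflow as tf
import tensorflow_text as text  # importing this registers the tensorflow-text custom ops
from transformers import TFBertModel


class BertWithTokenizer(tf.keras.Model):
    """Wraps tokenizer and encoder so the SavedModel accepts raw strings."""

    def __init__(self, vocab_path, **kwargs):
        super().__init__(**kwargs)
        # BertTokenizer is implemented with tensorflow-text custom ops; these
        # ops end up in the SavedModel graph and must be registered in any
        # process that later loads the model.
        self.tokenizer = text.BertTokenizer(vocab_path, lower_case=True)
        self.bert = TFBertModel.from_pretrained("bert-base-uncased")

    def call(self, inputs):
        # [batch, words, wordpieces] -> [batch, wordpieces], zero-padded.
        tokens = self.tokenizer.tokenize(inputs).merge_dims(-2, -1)
        ids = tf.cast(tokens.to_tensor(default_value=0), tf.int32)
        mask = tf.cast(tf.not_equal(ids, 0), tf.int32)
        return self.bert(input_ids=ids, attention_mask=mask).last_hidden_state


model = BertWithTokenizer("vocab.txt")
_ = model(tf.constant(["hello world"]))   # trace the model once
model.save("saved_model_dir")             # export in SavedModel format
```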
I can load the model again in Python by specifying the custom objects used in the model as follows:
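For instance (BertWithTokenizer being the illustrative wrapper class from the sketch above; importing tensorflow_text is what registers the custom ops in the Python process):

```python
import tensorflow as tf
import tensorflow_text as text  # side effect: registers the custom ops

from my_model import BertWithTokenizer  # hypothetical module defining the wrapper

model = tf.keras.models.load_model(
    "saved_model_dir",
    custom_objects={"BertWithTokenizer": BertWithTokenizer},
)
embeddings = model(tf.constant(["hello world"]))
```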
I have not yet found a solution for loading the model over the C API in the application. The code is as follows:
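The snippet itself is not preserved in the issue; the standard C-API loading sequence, presumably close to what is used here, goes through TF_LoadSessionFromSavedModel with the "serve" tag (the export path is a placeholder):

```c
#include <stdio.h>
#include <tensorflow/c/c_api.h>

int main(void) {
  TF_Status* status = TF_NewStatus();
  TF_Graph* graph = TF_NewGraph();
  TF_SessionOptions* opts = TF_NewSessionOptions();
  const char* tags[] = {"serve"};

  /* Without the tensorflow-text ops registered in this process, this call
   * typically fails with an "Op type not registered" error. */
  TF_Session* session = TF_LoadSessionFromSavedModel(
      opts, /*run_options=*/NULL, "saved_model_dir",
      tags, /*tags_len=*/1, graph, /*meta_graph_def=*/NULL, status);
  if (TF_GetCode(status) != TF_OK) {
    fprintf(stderr, "Error loading model: %s\n", TF_Message(status));
  }

  if (session != NULL) {
    TF_CloseSession(session, status);
    TF_DeleteSession(session, status);
  }
  TF_DeleteSessionOptions(opts);
  TF_DeleteGraph(graph);
  TF_DeleteStatus(status);
  return 0;
}
```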
When running, loading fails, and the error message retrieved from the status indicates that the tensorflow-text op types used by the model are not registered.
There have been similar issues in tensorflow/serving, e.g. https://groups.google.com/a/tensorflow.org/g/developers/c/LUvQAm3BsAs, and there is also an explanation of how to use custom ops in serving: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/custom_op.md. However, these focus on usage from Python. Is there any way to use tensorflow-text operations in models served via the TensorFlow C API? I suppose this would require a library that the application can link against, similar to the TensorFlow C libraries.
The Python modules tensorflow and tensorflow-text, as well as the TensorFlow C-API libraries, are at version 2.13.0.