cceasy opened this issue 1 day ago
Python 3.10.14
tensorflow 2.18.0
tensorflow-datasets 4.9.6
tensorflow-io-gcs-filesystem 0.37.1
tensorflow-metadata 1.16.1
tensorflow-text 2.18.0
keras 3.6.0
keras-hub-nightly 0.16.1.dev202410210343
keras-nlp 0.17.0
I am trying to use TensorFlow Serving to serve a Keras BERT model, but prediction through the REST API fails. Details are below. Can you please help me resolve this problem?
model definition
save the model to local path
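The save step is also missing from the report. In Keras 3, a model is exported as a TensorFlow SavedModel (the format TF Serving loads) with `model.export()`. A self-contained sketch using a tiny stand-in model, where the path `my_keras_bert_model/1` is an assumption following TF Serving's versioned directory layout:

```python
import os
import keras

# Stand-in model; in the issue this would be the BERT classifier.
inputs = keras.Input(shape=(4,))
outputs = keras.layers.Dense(2)(inputs)
model = keras.Model(inputs, outputs)

# Keras 3: export() writes a TensorFlow SavedModel (not a .keras file).
# TF Serving expects a numeric version subdirectory, hence the "/1".
export_dir = "my_keras_bert_model/1"
model.export(export_dir)

print(os.path.exists(os.path.join(export_dir, "saved_model.pb")))  # True
```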
build the tensorflow serving docker image
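The docker step is missing from the report as well. A common pattern (the image tag, container name, and paths are assumptions) is to mount the exported model into the stock `tensorflow/serving` image:

```shell
# Assumed layout: ./my_keras_bert_model/1/ contains the SavedModel.
# The 2.18.0 tag is an assumption chosen to match the TF version above.
docker run -d --name my_keras_bert_model \
  -p 8501:8501 \
  -v "$(pwd)/my_keras_bert_model:/models/my_keras_bert_model" \
  -e MODEL_NAME=my_keras_bert_model \
  tensorflow/serving:2.18.0
```

Note that the stock `tensorflow/serving` binary does not register tensorflow-text custom ops; a model whose graph contains them needs a Model Server build with those ops linked in.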
predict request
POST http://localhost:8501/v1/models/my_keras_bert_model/versions/1:predict Content-Type: application/json
{"instances": ["What an amazing movie!", "A total waste of my time."]}
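The same request can be issued from Python. A sketch using only the standard library (the URL assumes the server is reachable at `localhost:8501` as above):

```python
import json
from urllib import request

# Build the TF Serving REST "instances" payload shown above.
payload = {"instances": ["What an amazing movie!", "A total waste of my time."]}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    "http://localhost:8501/v1/models/my_keras_bert_model/versions/1:predict",
    data=body,
    headers={"Content-Type": "application/json"},
)

# Sending requires the serving container to be up; uncomment to post.
# with request.urlopen(req) as resp:
#     print(json.load(resp))

print(body.decode("utf-8"))
```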
predict output (ERROR)
{ "error": "Op type not registered 'TFText>RoundRobinTrim' in binary running on ljh-my-keras-bert-model. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. tf.contrib.resampler), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed." }