Open cceasy opened 3 weeks ago
Hi @cceasy, Thank you for reporting. I was able to reproduce the issue. I will check on this internally and update here. Below is the error:
2024-11-12 09:34:40.365027: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
E0000 00:00:1731404080.457354 107 mlir_bridge_pass_util.cc:68] Failed to parse __inference_serving_fn_19270: Op type not registered 'TFText>RoundRobinTrim' in binary running on 58d2778e1319. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. `tf.contrib.resampler`), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
I0000 00:00:1731404080.461934 107 mlir_graph_optimization_pass.cc:401] MLIR V1 optimization pass is not enabled
...
2024-11-12 09:34:40.505939: E external/org_tensorflow/tensorflow/core/grappler/optimizers/tfg_optimizer_hook.cc:135] tfg_optimizer{any(tfg-consolidate-attrs,tfg-toposort,tfg-shape-inference{graph-version=0},tfg-prepare-attrs-export)} failed: INVALID_ARGUMENT: Unable to find OpDef for TFText>RoundRobinTrim
While importing function: __inference_serving_fn_19270
when importing GraphDef to MLIR module in GrapplerHook
Thank you!
I am trying to use TensorFlow Serving to serve a Keras BERT model, but I have a problem predicting with the REST API; the details are below. Can you please help me resolve this problem?
predict output (ERROR)
{ "error": "Op type not registered 'TFText>RoundRobinTrim' in binary running on ljh-my-keras-bert-model. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. `tf.contrib.resampler`), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed." }
my local versions
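The version list from the original report was collapsed; one way to capture the relevant local versions (package names assumed):

```shell
# List installed TensorFlow-family packages; tensorflow-text must match
# the tensorflow minor version.
pip freeze | grep -Ei 'tensorflow|keras' || echo "no tensorflow packages found"
```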
model definition
save the model to local path
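The save step was also collapsed; a minimal sketch using a trivial stand-in model (the real BERT model was not included in the report). TF Serving expects the SavedModel under a numeric version subdirectory:

```python
import os
import tempfile
import tensorflow as tf

# Hypothetical stand-in for the actual Keras BERT model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

# TF Serving layout: <export_root>/<model_name>/<version>/
export_dir = os.path.join(tempfile.mkdtemp(), "my-keras-bert-model", "1")
model.export(export_dir)  # writes a SavedModel with a serving signature
print(os.path.exists(os.path.join(export_dir, "saved_model.pb")))
```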
build the tensorflow serving docker image
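The docker-build step was collapsed as well; a minimal sketch of one common way to bake the SavedModel into a serving image (paths and model name assumed from the request URL below). Note that the stock `tensorflow/serving` image does not link the TF Text ops, which is consistent with the missing-op error:

```dockerfile
FROM tensorflow/serving
# Copy the exported SavedModel (version subdirectory included) into the image.
COPY ./my-keras-bert-model /models/model/1
ENV MODEL_NAME=model
```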
predict request
POST http://localhost:8080/v1/models/model/versions/1:predict
Content-Type: application/json
{"instances": ["What an amazing movie!", "A total waste of my time."]}