Open · friesel opened this issue 6 years ago
I have a similar problem with transformers
The query script is meant to be an example. Please copy it and modify it for your own use case. In this case, you need to send along inputs and targets (even though the targets will not actually be used). This is because the serving input function uses the same example parsing logic as is used during training, so whatever is required at training time must also be sent at serving time. Alternatively, you could modify the serving input function to change the parsing logic.
Modifying the _make_example helper used by the query script to include a dummy targets feature will let you send targets along with the inputs:
import tensorflow as tf


def _make_example(input_ids, feature_name="inputs"):
  features = {
      feature_name:
          tf.train.Feature(int64_list=tf.train.Int64List(value=input_ids)),
      # Dummy "targets" feature: the exported graph's parser requires it,
      # but the value is never used for prediction.
      "targets":
          tf.train.Feature(int64_list=tf.train.Int64List(value=[5])),
  }
  return tf.train.Example(features=tf.train.Features(feature=features))
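As a quick sanity check of the modified helper (a minimal sketch; the input ids below are arbitrary placeholders, not real vocabulary ids), you can serialize an example and confirm that the dummy targets feature round-trips:

# Sanity check: the serialized example now carries a "targets" feature.
example = _make_example([4, 8, 15, 16], feature_name="inputs")
parsed = tf.train.Example.FromString(example.SerializeToString())
assert "targets" in parsed.features.feature
assert list(parsed.features.feature["targets"].int64_list.value) == [5]

The other option mentioned above is to change the parsing logic itself. A sketch of one possible workaround, under the assumption that the serving input function builds its ParseSingleExample spec from the problem's example_reading_spec() (which the error's dense_keys suggest); MyClassificationProblem is a hypothetical placeholder for your own Text2ClassProblem subclass, not a stock class:

# Hypothetical alternative: make "targets" optional in the parsing spec so
# the exported parser no longer requires it.
import tensorflow as tf
from tensor2tensor.data_generators import text_problems


class MyClassificationProblem(text_problems.Text2ClassProblem):

  def example_reading_spec(self):
    data_fields, data_items_to_decoders = (
        super(MyClassificationProblem, self).example_reading_spec())
    # Give "targets" a default value so requests without it still parse;
    # the dummy default plays the same role as the hard-coded [5] above.
    data_fields["targets"] = tf.FixedLenFeature([1], tf.int64,
                                                default_value=[0])
    return data_fields, data_items_to_decoders

With this approach you would need to re-run t2t-exporter so the changed parsing spec ends up in the exported graph.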
Description
I created a Text2ClassProblem with 7 labels. When I serve the exported model with tensorflow_model_server and send a request via t2t-query-server, the client crashes with the error below.
Environment information
OS: Ubuntu 16.04
$ pip freeze | grep tensor
tensor2tensor==1.6.3
tensorboard==1.7.0
tensorflow-gpu==1.7.0
$ python -V
Python 2.7.12
Steps to reproduce:
Create a Text2ClassProblem and assign label ids 0-6 (a sketch of such a problem follows this list), then run:
1. t2t-datagen
2. t2t-trainer
3. t2t-exporter
4. tensorflow_model_server
5. t2t-query-server (send a request; it should then crash)
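For reference, a minimal sketch of such a problem, assuming the Text2ClassProblem API in tensor2tensor 1.6.x; the class name, vocabulary size, labels, and sample data below are placeholders, not my actual problem:

# Hypothetical seven-class problem (label ids 0-6).
from tensor2tensor.data_generators import text_problems
from tensor2tensor.utils import registry


@registry.register_problem
class MySevenClassProblem(text_problems.Text2ClassProblem):
  """Classify a document into one of seven classes."""

  @property
  def approx_vocab_size(self):
    return 2**13  # ~8k subword pieces

  @property
  def num_classes(self):
    return 7

  @property
  def is_generate_per_split(self):
    # Generate one pool of samples and let t2t split it into train/eval.
    return False

  def class_labels(self, data_dir):
    del data_dir
    return ["class_%d" % i for i in range(self.num_classes)]

  def generate_samples(self, data_dir, tmp_dir, dataset_split):
    del data_dir, tmp_dir, dataset_split
    # Replace with real data loading; each sample pairs text with a label id.
    for text, label in [("first example document", 0),
                        ("second example document", 6)]:
      yield {"inputs": text, "label": label}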
Error logs:
Traceback (most recent call last):
  File "/home//dev/git/prinvision/nets/test_doc_to_language.py", line 22, in
    outputs = service.process(inputs)
  File "/home/ /dev/git/prinvision/nets/public_service_api_v3.py", line 50, in process
    outputs = serving_utils.predict([input_string], self.problem, self.request_fn)
  File "/home//.local/lib/python2.7/site-packages/tensor2tensor/serving/serving_utils.py", line 118, in predict
    predictions = request_fn(examples)
  File "/home//.local/lib/python2.7/site-packages/tensor2tensor/serving/serving_utils.py", line 75, in _make_grpc_request
    response = stub.Predict(request, timeout_secs)
  File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 309, in call
    self._request_serializer, self._response_deserializer)
  File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 195, in _blocking_unary_unary
    raise _abortion_error(rpc_error_call)
grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.INVALID_ARGUMENT, details="Feature: targets (data type: int64) is required but could not be found.
[[Node: ParseSingleExample/ParseSingleExample = ParseSingleExample[Tdense=[DT_INT64, DT_INT64], dense_keys=["batch_prediction_key", "targets"], dense_shapes=[[1], [1]], num_sparse=1, sparse_keys=["inputs"], sparse_types=[DT_INT64]](arg0, ParseSingleExample/Reshape, ParseSingleExample/Const)]]
[[Node: DatasetToSingleElement = DatasetToSingleElement[output_shapes=[[?,1], [?,?,1,1], [?,1,1,1]], output_types=[DT_INT32, DT_INT32, DT_INT32], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]")