Closed: shayonc closed this issue 6 years ago
The problem was that we were keying the request input with `signature_constants.CLASSIFY_INPUTS`. Switching to `signature_constants.PREDICT_INPUTS` made it work:

```python
request.inputs[signature_constants.PREDICT_INPUTS].CopyFrom(
    tf.contrib.util.make_tensor_proto(input_nums, shape=[1, 100], dtype=tf.int32))
```
My team and I built a text-classification model using an LSTM and saved it with `saved_model_builder`. We then tested it with `saved_model_cli`, and inference worked. However, when we ran the ModelServer and called it via gRPC, we got the following error:
We are not sure why this variable is not being initialized here, given that it works fine with `saved_model_cli`. Has anyone else faced this issue?
Here is our saved_model script:
Here is some information about our saved model:
Here is our gRPC client request:
Thanks in advance!