Hey, I exported the model using the command provided, but when I run prediction on a new query it always outputs the same thing. I checked a few things and found that response = stub.Predict(request, timeout_secs) (from serving_utils.py) always returns the same value. Any idea what went wrong here?
My input is: Hallo
The output is: In fact, the government does not want to protect itself or to do so. It is only a question, bigger than any other, that is necessary.
No matter what my input is, I always get the same result.
...
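For what it's worth, a minimal hand-rolled Predict call along these lines can show whether the request actually changes with the input, which would separate a client-side encoding problem from a problem with the exported model. This is only a sketch: the servable name, port, and the "input"/"outputs" keys are assumptions based on a default t2t export, not taken from my setup.

```python
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc


def make_request(token_ids, servable_name="transformer"):
    # The t2t export consumes serialized tf.Examples with an int64 "inputs"
    # feature holding already-encoded token ids (1 = EOS). The ids here are
    # fake; the point is only to see whether the response varies with them.
    example = tf.train.Example(features=tf.train.Features(feature={
        "inputs": tf.train.Feature(
            int64_list=tf.train.Int64List(value=token_ids)),
    }))
    request = predict_pb2.PredictRequest()
    request.model_spec.name = servable_name  # assumed servable name
    request.inputs["input"].CopyFrom(
        tf.make_tensor_proto([example.SerializeToString()], shape=[1]))
    return request


channel = grpc.insecure_channel("localhost:8500")  # assumed serving port
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Two clearly different inputs: identical "outputs" tensors would point at
# the model/export side rather than the client request building.
for ids in ([5, 6, 7, 1], [900, 901, 902, 1]):
    response = stub.Predict(make_request(ids), 10.0)
    print(ids, "->", tf.make_ndarray(response.outputs["outputs"]))
```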
Environment information
OS: <your answer here>
$ pip freeze | grep tensor
mesh-tensorflow==0.0.5
tensor2tensor==1.11.0
tensorboard==1.12.0
tensorflow==1.12.0
tensorflow-gpu==1.10.1
tensorflow-hub==0.1.1
tensorflow-metadata==0.9.0
tensorflow-probability==0.5.0
tensorflow-serving-api==1.12.0
tensorflow-tensorboard==1.5.1
$ python -V
Python 3.6.7 :: Anaconda, Inc.
I get the same result for every test query on a question-answering problem too, after training the model for 130k steps, even though the training loss had gone below 1.
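To tell whether the exported model itself is degenerate or only the serving path is, one option is to load the SavedModel directly in a session and feed it two different serialized examples. Again only a sketch: the export_dir is a placeholder, and the tag, signature, and feature/tensor keys are assumptions for a default t2t export.

```python
import tensorflow as tf

export_dir = "/path/to/exported_model"  # hypothetical; point at the timestamped export subdir


def fake_example(token_ids):
    # Same serialized tf.Example layout the serving client sends.
    return tf.train.Example(features=tf.train.Features(feature={
        "inputs": tf.train.Feature(
            int64_list=tf.train.Int64List(value=token_ids)),
    })).SerializeToString()


with tf.Session(graph=tf.Graph()) as sess:
    meta = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    sig = meta.signature_def[
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    input_name = sig.inputs["input"].name      # assumed receiver tensor key
    output_name = sig.outputs["outputs"].name  # assumed prediction key

    for ids in ([5, 6, 7, 1], [900, 901, 902, 1]):
        out = sess.run(output_name, feed_dict={input_name: [fake_example(ids)]})
        # Identical outputs for different ids would point at the export itself.
        print(ids, "->", out)
```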