tensorflow / serving

A flexible, high-performance serving system for machine learning models
https://www.tensorflow.org/serving
Apache License 2.0

Error: Requested more than 0 entries, but params is empty. #1186

Closed echan00 closed 5 years ago

echan00 commented 5 years ago

I am trying to serve my Chinese-to-English model and am having trouble querying it. I am receiving the error "Requested more than 0 entries, but params is empty."

(test) root@ubuntu-c-8-16gib-sfo2-01:~/T2T_Model# t2t-query-server   --server=0.0.0.0:9000   --servable_name=transformer   --problem=translate_enzh_wmt32k_rev   --data_dir=/root/T2T_Model/t2t_data   --inputs_once='Hello my name is John.'

Traceback (most recent call last):
  File "/usr/local/bin/t2t-query-server", line 17, in <module>
    tf.app.run()
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "/usr/local/bin/t2t-query-server", line 12, in main
    query.main(argv)
  File "/usr/local/lib/python2.7/dist-packages/tensor2tensor/serving/query.py", line 89, in main
    outputs = serving_utils.predict([inputs], problem, request_fn)
  File "/usr/local/lib/python2.7/dist-packages/tensor2tensor/serving/serving_utils.py", line 157, in predict
    predictions = request_fn(examples)
  File "/usr/local/lib/python2.7/dist-packages/tensor2tensor/serving/serving_utils.py", line 113, in _make_grpc_request
    response = stub.Predict(request, timeout_secs)
  File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 533, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
    raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
    status = StatusCode.INVALID_ARGUMENT
    details = "Requested more than 0 entries, but params is empty.  Params shape: [1,4,8,0,64]
     [[{{node transformer/while/GatherNd_32}} = GatherNd[Tindices=DT_INT32, Tparams=DT_FLOAT, _output_shapes=[[?,8,?,?,64]], _device="/job:localhost/replica:0/task:0/device:CPU:0"](transformer/while/Reshape_65, transformer/while/stack)]]"
    debug_error_string = "{"created":"@1542086942.107507941","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1017,"grpc_message":"Requested more than 0 entries, but params is empty.  Params shape: [1,4,8,0,64]\n\t [[{{node transformer/while/GatherNd_32}} = GatherNd[Tindices=DT_INT32, Tparams=DT_FLOAT, _output_shapes=[[?,8,?,?,64]], _device="/job:localhost/replica:0/task:0/device:CPU:0"](transformer/while/Reshape_65, transformer/while/stack)]]","grpc_status":3}"
>

The model server itself starts and loads the model fine, but it logs the same error when it receives the request:

(test) root@ubuntu-c-8-16gib-sfo2-01:~/T2T_Model# tensorflow_model_server   --port=9000   --model_name=transformer   --model_base_path=/root/T2T_Model/t2t_train/translate_enzh_wmt32k/transformer-transformer_base/export
2018-11-13 05:28:29.116290: I tensorflow_serving/model_servers/server.cc:82] Building single TensorFlow model file config:  model_name: transformer model_base_path: /root/T2T_Model/t2t_train/translate_enzh_wmt32k/transformer-transformer_base/export
2018-11-13 05:28:29.116412: I tensorflow_serving/model_servers/server_core.cc:461] Adding/updating models.
2018-11-13 05:28:29.116424: I tensorflow_serving/model_servers/server_core.cc:558]  (Re-)adding model: transformer
2018-11-13 05:28:29.216782: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: transformer version: 1542073770}
2018-11-13 05:28:29.216806: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: transformer version: 1542073770}
2018-11-13 05:28:29.216815: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: transformer version: 1542073770}
2018-11-13 05:28:29.216830: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /root/T2T_Model/t2t_train/translate_enzh_wmt32k/transformer-transformer_base/export/1542073770
2018-11-13 05:28:29.216838: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /root/T2T_Model/t2t_train/translate_enzh_wmt32k/transformer-transformer_base/export/1542073770
2018-11-13 05:28:29.537966: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2018-11-13 05:28:29.597214: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
2018-11-13 05:28:29.722289: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:162] Restoring SavedModel bundle.
2018-11-13 05:28:30.139345: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:138] Running MainOp with key saved_model_main_op on SavedModel bundle.
2018-11-13 05:28:30.227063: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:259] SavedModel load for tags { serve }; Status: success. Took 1010210 microseconds.
2018-11-13 05:28:30.227116: I tensorflow_serving/servables/tensorflow/saved_model_warmup.cc:83] No warmup data file found at /root/T2T_Model/t2t_train/translate_enzh_wmt32k/transformer-transformer_base/export/1542073770/assets.extra/tf_serving_warmup_requests
2018-11-13 05:28:30.227223: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: transformer version: 1542073770}
2018-11-13 05:28:30.229398: I tensorflow_serving/model_servers/server.cc:286] Running gRPC ModelServer at 0.0.0.0:9000 ...

2018-11-13 05:59:38.052592: W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at gather_nd_op.cc:50 : Invalid argument: Requested more than 0 entries, but params is empty.  Params shape: [1,4,8,0,64]

This is my data request:

model_spec {
  name: "transformer"
}
inputs {
  key: "input"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 1
      }
    }
    string_val: "\n\021\n\017\n\006inputs\022\005\032\003\n\001\001"
  }
}
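For reference, this request is a single DT_STRING tensor holding one serialized tf.train.Example, which is what tensor2tensor's query client builds. Below is a minimal sketch of constructing the same gRPC request by hand with tensorflow-serving-api; the "inputs" feature key, the "input" tensor key, and the int64 token ids follow the tensor2tensor export convention and are assumptions here, so adjust them if your export differs.

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

def predict(token_ids, server="0.0.0.0:9000", model_name="transformer"):
    # Wrap the already-encoded token ids in a serialized tf.train.Example.
    example = tf.train.Example(features=tf.train.Features(feature={
        "inputs": tf.train.Feature(int64_list=tf.train.Int64List(value=token_ids)),
    }))
    request = predict_pb2.PredictRequest()
    request.model_spec.name = model_name
    # The exported model expects a [1] DT_STRING tensor under the "input" key.
    request.inputs["input"].CopyFrom(
        tf.make_tensor_proto([example.SerializeToString()], shape=[1]))
    stub = prediction_service_pb2_grpc.PredictionServiceStub(
        grpc.insecure_channel(server))
    return stub.Predict(request, 10.0)  # 10-second timeout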

My environment: tensor2tensor (1.10.0), tensorboard (1.12.0), tensorflow (1.12.0), tensorflow-serving-api (1.12.0)

I would appreciate any help with this issue; it seems to be related to Chinese -> English translation.

haukurb commented 5 years ago

I'm having the same problem

I originally trained the model on: tensor2tensor==1.9.0, tensorboard==1.10.0, tensorflow-gpu==1.10.1

And received the following error: { "error": "Reshape cannot infer the missing input size for an empty tensor unless all specified input sizes are non-zero\n\t [[{{node transformer/body/parallel_0/body/encoder/layer_0/self_attention/multihead_attention/dot_product_attentio...

I then updated to: mesh-tensorflow==0.0.3, tensor2tensor==1.10.0, tensorboard==1.12.0, tensorflow==1.12.0, tensorflow-gpu==1.10.1, tensorflow-serving-api==1.12.0

And now I receive the following error: { "error": "Requested more than 0 entries, but params is empty. Params shape: [1,6,8,0,64]\n\t [[{{node transformer/while/GatherNd_17}}]]\n\t [[{{node transformer/strided_slice_11/_1015}}]]" }

Generated from the following POST request (the same happens with gRPC):

curl --header "Content-Type: application/json" \
  --request POST \
  http://localhost:9001/v1/models/transformer:predict \
  --data '{"instances":[{"input":{"b64":"CiAKHgoGaW5wdXRzEhQaEgoQpweFF0oOBsI+EclIBP4cAQ=="}}],"signature_name":"serving_default"}'
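For completeness, the "b64" value here is just a base64-encoded serialized tf.train.Example, i.e. the same payload the gRPC path sends as a string tensor. A rough sketch of how I produce it; the "inputs" feature key is an assumption based on the tensor2tensor export:

import base64
import json
import tensorflow as tf

def make_rest_payload(token_ids):
    # Same Example as in the gRPC case, base64-encoded for the REST API.
    example = tf.train.Example(features=tf.train.Features(feature={
        "inputs": tf.train.Feature(int64_list=tf.train.Int64List(value=token_ids)),
    }))
    b64 = base64.b64encode(example.SerializeToString()).decode("utf-8")
    # Request body for POST /v1/models/transformer:predict
    return json.dumps({
        "instances": [{"input": {"b64": b64}}],
        "signature_name": "serving_default",
    })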

echan00 commented 5 years ago

@haukurb https://github.com/tensorflow/tensor2tensor/issues/1219

haukurb commented 5 years ago

Great, thanks.