I've solved this by using the regress endpoint instead:
curl -d '{"signature_name": "regression", "examples": [{"column_1": [436772], "column_2": [1681]}, {"column_1": 1100, "column_2": 2000}]}' \
-X POST http://localhost:8501/v1/models/my_model:regress
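In case it helps anyone, here's the same call from Python with the requests library. This is just a sketch of the curl command above; the model name and port are whatever your container is serving.

import json
import requests

# Same payload as the curl command: the regress API takes a list of
# tf.Example-like feature maps under the "examples" key.
payload = {
    "signature_name": "regression",
    "examples": [
        {"column_1": [436772], "column_2": [1681]},
        {"column_1": 1100, "column_2": 2000},
    ],
}

# Assumes TensorFlow Serving is listening on REST port 8501.
resp = requests.post(
    "http://localhost:8501/v1/models/my_model:regress",
    data=json.dumps(payload),
)
print(resp.json())  # {"results": [...]}, one score per example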
On a side note, I also noticed the existence of tfr.data.build_ranking_serving_input_receiver_fn, which seems to be the correct way to do things as discussed in other issues. However, that seems to make things even more challenging, as I'd have to serialize everything to protobufs first.
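Roughly what I mean by serializing first; this is only a sketch, assuming the exported signature takes a single string tensor of serialized tf.Example protos, and the b64 wrapper is how the TF Serving REST API passes binary strings:

import base64
import tensorflow as tf

# Build a tf.Example with the same two features as the curl command.
example = tf.train.Example(features=tf.train.Features(feature={
    "column_1": tf.train.Feature(int64_list=tf.train.Int64List(value=[436772])),
    "column_2": tf.train.Feature(int64_list=tf.train.Int64List(value=[1681])),
}))

# Binary strings must be base64-encoded in the REST JSON payload.
serialized = example.SerializeToString()
instance = {"b64": base64.b64encode(serialized).decode("utf-8")}
# POST {"instances": [instance]} to .../v1/models/my_model:predict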
Thanks for your question and for figuring out a solution on your end. tfr.data.build_ranking_serving_input_receiver_fn is the listwise serving receiver and it accepts ExampleListWithContext, not tf.Example. I think your curl command above sends tf.Examples, right?
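If you want to try the listwise path, here is a rough sketch of building an ExampleListWithContext in Python; it assumes the ELWC message from tensorflow_serving.apis.input_pb2 (the tensorflow-serving-api package) and reuses your column names:

import base64
from tensorflow_serving.apis import input_pb2

# One ELWC holding two documents that should be ranked together.
elwc = input_pb2.ExampleListWithContext()
for col1, col2 in [(436772, 1681), (1100, 2000)]:
    example = elwc.examples.add()
    example.features.feature["column_1"].int64_list.value.append(col1)
    example.features.feature["column_2"].int64_list.value.append(col2)

# The listwise receiver parses serialized ELWC protos, so the REST payload
# would carry the base64-encoded bytes as a single string instance.
instance = {"b64": base64.b64encode(elwc.SerializeToString()).decode("utf-8")}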
I'm having trouble figuring out how to get predictions from a ranking model running on the TensorFlow Serving Docker container.
I was able to successfully export my model to a SavedModel as described in https://github.com/tensorflow/ranking/issues/53 and other issues. I'm also able to use the saved_model_cli to view the signature and run inference (a rough Python equivalent is sketched below). But I can't for the life of me figure out what the payload should be for my Docker container.
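For reference, a rough Python equivalent of inspecting the signatures (TF 2.x; the export path below is a placeholder):

import tensorflow as tf

# Load the SavedModel and list its signatures and their declared inputs,
# similar to what saved_model_cli show prints.
loaded = tf.saved_model.load("/path/to/export")  # placeholder path
for name, fn in loaded.signatures.items():
    print(name, fn.structured_input_signature)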
I've tried many variations of nesting the keys inputs, instances, and examples, as well as base64-encoding the JSON input. Has anyone been able to successfully do this?
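For reference, these are the payload shapes the TF Serving REST API documents, as far as I can tell; which one applies depends on the signature's method name (predict vs. classify/regress):

# :predict with a parsing serving input receiver: each instance is a
# base64-encoded serialized proto.
predict_payload = {
    "signature_name": "serving_default",  # assumption: default signature name
    "instances": [{"b64": "<base64 of a serialized proto>"}],
}

# :regress / :classify: plain JSON feature maps, which the server converts
# into tf.Example protos itself.
regress_payload = {
    "signature_name": "regression",
    "examples": [{"column_1": [436772], "column_2": [1681]}],
}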