tobegit3hub / simple_tensorflow_serving

Generic and easy-to-use serving service for machine learning models
https://stfs.readthedocs.io
Apache License 2.0

What does this error mean when doing inference with an image? #21

Closed CyLouisKoo closed 6 years ago

CyLouisKoo commented 6 years ago

[screenshot of the error message]

tobegit3hub commented 6 years ago

It seems that you are sending the request with the "input" key, but "input" is not in the model signature.

For image inference, we recommend exporting the image model with "images" as the input name, so that clients send requests with the "images" key instead of "input".
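For reference, here is a minimal export sketch assuming TensorFlow 1.x. The point is to register the input tensor under the "images" key in the serving signature; the placeholder, output name, and export path below are illustrative stand-ins for your own preprocessing and inference graph.

```python
import tensorflow as tf

export_dir = "./export/1"  # illustrative export path

with tf.Session(graph=tf.Graph()) as sess:
    # Stand-in graph: replace with your real model's input and output tensors.
    images = tf.placeholder(tf.string, shape=[None], name="images")
    scores = tf.identity(images, name="scores")

    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

    # Name the input "images" so clients can request with the "images" key.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"images": images},
        outputs={"scores": scores},
    )
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        },
    )
    builder.save()
```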

CyLouisKoo commented 6 years ago

How do I do that, specifically? Suppose I have the following ready-made model. How can I quickly call it, and is there a ready-made solution? Thanks a lot for your answer. [screenshot of the model files]

tobegit3hub commented 6 years ago

Thanks for your response. I have updated the deep_image_model so that it takes a base64-encoded image as input.

Now you can git pull to get the latest pre-trained model and test it with these commands.

simple_tensorflow_serving --model_base_path="./models/deep_image_platforms"

curl -X POST -F 'image=@./images/mew.jpg' -F "model_version=1" 127.0.0.1:8500

Or go to the dashboard at http://localhost:8500/ and upload the image to run inference.
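If you prefer to call the server from Python instead of curl, a rough equivalent of the command above (same endpoint, form fields, and example image path) might look like this:

```python
import requests

# Python equivalent of the curl example above: a multipart upload of the image
# plus the model_version form field, posted to the local serving endpoint.
endpoint = "http://127.0.0.1:8500"

with open("./images/mew.jpg", "rb") as f:
    response = requests.post(
        endpoint,
        files={"image": f},
        data={"model_version": "1"},
    )

print(response.text)
```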

CyLouisKoo commented 6 years ago

Sorry for the late reply. Both of the methods above work. Thanks a lot, and I will study your code framework further.

tobegit3hub commented 6 years ago

Great, and thanks for reporting the issue.

I'll close this issue now; feel free to reopen it if you have any other questions.