tobegit3hub / simple_tensorflow_serving

Generic and easy-to-use serving service for machine learning models
https://stfs.readthedocs.io
Apache License 2.0
757 stars 192 forks

Image inference error #47

Open NorthLatitudeOne opened 5 years ago

NorthLatitudeOne commented 5 years ago

Hi, can somebody help take a look at my problem? Thanks! Image inference in the web menu's test image inference returns: Predict result: `[{"error": "Inference error <class 'KeyError'>: 'input_image'"}, 400]`

Image inference from the command line:

```
E:\Anaconda\Lib\site-packages\simple_tensorflow_serving>curl -X POST -F 'image=images/NG0093.jpg' -F "model_version=3" 127.0.0.1:8500
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">

500 Internal Server Error

Internal Server Error

The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
```

Simple TensorFlow Serving log:

```
2019-04-18 09:00:46 ERROR Need to set image or images for form-data
2019-04-18 09:00:46 ERROR Exception on / [POST]
Traceback (most recent call last):
  File "e:\anaconda\lib\site-packages\flask\app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "e:\anaconda\lib\site-packages\flask_cors\extension.py", line 161, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "e:\anaconda\lib\site-packages\flask\_compat.py", line 35, in reraise
    raise value
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 184, in decorated
    return f(*decorator_args, **decorator_kwargs)
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 308, in inference
    json_result, status_code = do_inference()
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 349, in do_inference
    if "model_name" in json_data:
TypeError: argument of type 'NoneType' is not iterable
2019-04-18 09:00:46 INFO 127.0.0.1 - - [18/Apr/2019 09:00:46] "POST / HTTP/1.1" 500 -
```
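A note on the log above (editor's addition, not part of the original report): the server explicitly logs "Need to set image or images for form-data", so it expects a multipart form field literally named `image` (or `images`). On Windows `cmd`, the single quotes in `-F 'image=...'` are not stripped, and curl only uploads file contents when the value starts with `@`, so both issues can make the field arrive wrong or empty. A hedged sketch of a Python client that sidesteps the shell quoting entirely (the file path and model version are the ones from the report; the field names are assumptions based only on the log message):

```python
import requests


def build_image_request(image_path, model_version="3"):
    # `files` makes requests send multipart/form-data;
    # the field name "image" matches what the server log asks for
    files = {"image": open(image_path, "rb")}
    data = {"model_version": model_version}
    return files, data


if __name__ == "__main__":
    files, data = build_image_request("images/NG0093.jpg")
    response = requests.post("http://127.0.0.1:8500", files=files, data=data)
    print(response.text)
```

This is only a sketch; whether the server also needs `model_name` or other fields depends on the deployed configuration.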
jeffin07 commented 5 years ago

@NorthLatitudeOne I got a similar error: `{'error': "Inference error <class 'KeyError'>: 'in'"}`. Did you find any solution for this? I also tried with the model given in the README and still got the same type of error: `{"error": "Inference error <class 'KeyError'>: 'features'"}`

serlina commented 5 years ago

I used Docker to run simple_tensorflow_serving locally, and I can open 127.0.0.1:8500:

```
docker run -d -p 8500:8500 tobegit3hub/simple_tensorflow_serving
```

The Python version is 2.7.

When I used Postman to send the POST request for the image inference client call, it also raised an exception, as below:

(screenshot of the Postman error)

The code for converting JPEG to base64 is:

```python
from PIL import Image
import cv2
import cStringIO  # Python 2; use io.BytesIO on Python 3
import base64


def base64_encode_img(img):
    """
    :param img:
    :return:
    """
    img_rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    print('shape of img_rgb:', img_rgb.shape)
    pil_img = Image.fromarray(img_rgb)

    buf = cStringIO.StringIO()
    pil_img.save(buf, format="JPEG", quality=100)
    b64code = base64.urlsafe_b64encode(buf.getvalue())  # web-safe
    # b64code = base64.b64encode('abcdefgdisoaufd,0.342,0.456,0.987')  # not web-safe
    print(b64code)
    return b64code


if __name__ == "__main__":
    img_BGR = cv2.imread('./mew.jpg')
    base64_encode_img(img_BGR)
```

Could someone help me with the image client call for this model?

serlina commented 5 years ago

Sorry, correcting my above code: I actually used the code below to convert a local jpg to base64, and then pasted the base64 string into the Postman POST body:

```python
import base64

# Python 2 print statement; use print(encoded_string) on Python 3
with open("./mew.jpg", "rb") as image_file:
    encoded_string = base64.b64encode(image_file.read())
    print encoded_string
```

Note: if I use base64.urlsafe_b64encode to encode instead, it raises another error.
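For context (editor's addition, not part of the thread): the two encoders differ only in two alphabet characters, and `tf.decode_base64` decodes web-safe base64, so which client-side encoder works depends on which decode op the SavedModel uses. A small demonstration of the difference:

```python
import base64

# Bytes chosen so every output character lands on the two alphabet
# slots where the standard and web-safe alphabets differ
data = b"\xfb\xef\xbe"

standard = base64.b64encode(data)         # standard alphabet uses '+' and '/'
websafe = base64.urlsafe_b64encode(data)  # web-safe alphabet uses '-' and '_'

print(standard)  # b'++++'
print(websafe)   # b'----'
```

If the model graph decodes with `tf.decode_base64`, the client must use `urlsafe_b64encode`; a graph that decodes standard base64 needs `b64encode` instead, which would explain the mismatched errors above.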

tobegit3hub commented 5 years ago

Hi @serlina, it depends on your TensorFlow SavedModel's ops. If you use `tf.decode_base64(model_base64_placeholder)` to process the input data, you may try this client code, which has been tested in our environment:

```python
import requests
import base64


def main():
  image_file_name = "../../images/mew.jpg"
  image_b64_string = base64.urlsafe_b64encode(
      open(image_file_name, "rb").read())

  endpoint = "http://127.0.0.1:8500"
  input_data = {
      "data": {
          "image": [image_b64_string]
      }
  }
  result = requests.post(endpoint, json=input_data)
  print(result.json())


if __name__ == "__main__":
  main()
```
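One Python 3 caveat with a client like the one above (editor's addition, not part of the original reply): `base64.urlsafe_b64encode` returns `bytes`, which the `json` module cannot serialize, so the value must be decoded to `str` before being placed in the request body. A minimal round-trip sketch with fake image bytes:

```python
import base64
import json

# Fake JPEG header bytes, just for illustration
image_bytes = b"\xff\xd8\xff\xe0"

# In Python 3, urlsafe_b64encode returns bytes; decode to str for JSON
b64_string = base64.urlsafe_b64encode(image_bytes).decode("utf-8")
payload = json.dumps({"data": {"image": [b64_string]}})

# The receiving side can recover the original bytes from the JSON payload
restored = base64.urlsafe_b64decode(json.loads(payload)["data"]["image"][0])
assert restored == image_bytes
```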