ansarisam opened this issue 2 years ago
@ansarisam I think the model expects to receive the data as bytes, not floats. You can try replacing the line
image = np.array(image).astype('float')
with
image = np.array(image).astype(np.byte)
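For context, here is a minimal sketch of that change, assuming the image is loaded with PIL and resized to the 640x640 input mentioned in the original report below (the loading code itself is not shown in the report):

```python
import numpy as np
from PIL import Image

# Load and resize to the detector's assumed 640x640 input.
image = Image.open("test.jpg").resize((640, 640))

# Cast to bytes instead of float. Note that np.byte is an alias for np.int8 (signed).
image = np.array(image).astype(np.byte)
print(image.dtype, image.shape)  # int8 (640, 640, 3)
```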
Thanks for the reply.
Changing to image = np.array(image).astype(np.byte) did not solve the problem.
The error is now:
b'{"error":"Error code - ORT_INVALID_ARGUMENT - message: Unexpected input data type. Actual: (N11onnxruntime17PrimitiveDataTypeIaEE) , expected: (N11onnxruntime17PrimitiveDataTypeIhEE)"}'
@ansarisam There is a thread (https://github.com/microsoft/onnxruntime/issues/6261) discussing this issue. Currently, ai-serving depends on onnxruntime 1.6.0, which does not yet support UINT8 in the Java API. We need to pick up the latest onnxruntime, which should include the fix (https://github.com/microsoft/onnxruntime/pull/8401). It's not a simple task, so we need more time to finish it.
BTW, is there an ONNX model I can use to reproduce the error above, so that I can verify the fix after the update is done?
@ansarisam Could you try the latest code of ai-serving? It supports uint8 now. Use the following line to convert the input to uint8:
image = np.array(image).astype('uint8')
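A small sketch of the suggested conversion, with a byte-count check against the 1,228,800 elements expected in the original shape error (the image-loading step is an assumption, since the original snippet is not shown):

```python
import numpy as np
from PIL import Image

image = Image.open("test.jpg").resize((640, 640))
image = np.array(image).astype('uint8')

# uint8 is 1 byte per element, so the buffer size now equals the element count.
print(image.shape, image.nbytes)  # (640, 640, 3) 1228800
```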
NOTE: the new Docker images with the fix are not ready yet; we're working on them. You will need to compile the code to try it. Please let me know if you run into any problems.
I am getting an error when trying to predict from an ONNX model (TensorFlow-based object detection). When I call the ai-serving REST API, I get the following error:
b'{"error":"Shape [640, 640, 3], requires 1228800 elements but the buffer has 9830400 elements."}'
Here is the code that creates the input and calls the REST API.
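Side note on the numbers in the error: 640 * 640 * 3 = 1,228,800 elements, while astype('float') gives float64 at 8 bytes per element, which is exactly the 9,830,400-byte buffer reported. Since the snippet itself was not captured above, the following is only a hedged sketch of what such client code typically looks like; the file name, endpoint URL, model name, and payload layout are assumptions and should be adapted to the actual ai-serving deployment:

```python
import json
import numpy as np
import requests
from PIL import Image

# Build the input tensor: 640x640 RGB, cast to uint8 as the model expects.
image = Image.open("test.jpg").resize((640, 640))
pixels = np.array(image).astype('uint8')  # shape (640, 640, 3), 1,228,800 bytes

# Hypothetical REST call; adjust the URL, model name, and input field name
# to match the deployment and the model's input signature.
payload = {"X": [{"inputs": pixels.tolist()}]}
response = requests.post(
    "http://localhost:9090/v1/models/detector",
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
print(response.status_code, response.content)
```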