Hi, @ravimadhusudhan8. ML Lambda communicates with the model via TensorProto objects. The message states that you cannot multiply two TensorProto objects together. You have to retrieve the content from the TensorProtos, perform the add/multiply/subtract operation on that content, and then pack the result back into a response TensorProto object. You can investigate error logs on your own by looking at the container logs.
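As a rough sketch of that unpack/compute/repack pattern (assuming each input carries a single DT_DOUBLE value in its double_val field; the function name combine is just a placeholder):

import hydro_serving_grpc as hs

def combine(x, y):
    # unpack the scalar values from the incoming TensorProto objects
    value_x = x.double_val[0]
    value_y = y.double_val[0]
    # do the arithmetic on plain Python floats
    result = value_x * value_y
    # pack the result back into a response TensorProto
    return hs.TensorProto(
        dtype=hs.DT_DOUBLE,
        double_val=[result]
    )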
Got it!
Thanks
Changed the code to this - based on func_main.py at hydro-serving-example/demos/stateful_lstm_example/pipeline/postprocessing/src
And used TensorProto objects as input and output, as suggested. Attached are the Docker logs and test app screenshots.
Thanks
import math
import hydro_serving_grpc as hs

def multiply(x, y):
    input_x = x.double_val
    input_y = y.double_val
    print(input_x, input_y)
    result = input_x * input_y
    tensor_result_output = hs.TensorProto(
        dtype=hs.DT_DOUBLE,
        double_val=result
    )
    return tensor_result_output
Got this error message - status: 500, error: InternalUncaught, message: UNKNOWN: Exception calling application: unsupported operand type(s) for *: 'google.protobuf.pyext._message.RepeatedScalarContainer' and 'google.protobuf.pyext._message.RepeatedScalarContainer'
You're still operating on an object that doesn't support the * operator. This time it's protobuf's RepeatedScalarContainer, which is similar to a list. In your case those would be lists of length 1. So the correct steps would be:
...
result = input_x[0] * input_y[0]
tensor_result_output = hs.TensorProto(
dtype=hs.DT_DOUBLE,
double_val=[result]
)
...
Note that I've wrapped result in a list and passed it to the double_val field in the response tensor constructor.
For future reference, the documentation for the Python bindings for protocol buffers explains this and other issues.
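For illustration, a repeated scalar field behaves much like a Python list (a small sketch, assuming a TensorProto holding a couple of double values):

import hydro_serving_grpc as hs

t = hs.TensorProto(dtype=hs.DT_DOUBLE, double_val=[1.5, 2.5])
print(t.double_val[0])      # indexing works like a list: 1.5
print(list(t.double_val))   # convert to a plain Python list: [1.5, 2.5]
t.double_val.append(3.5)    # repeated fields support append/extend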
Thanks for the link and suggestions.
Getting a different error with the changed code
import math
import hydro_serving_grpc as hs

def subtract(x, y):
    input_x = x.double_val
    input_y = y.double_val
    print(input_x[0], input_y[0])
    result = input_x[0] - input_y[0]
    tensor_result_output = hs.TensorProto(
        dtype=hs.DT_DOUBLE,
        double_val=[result]
    )
    return tensor_result_output
ML Lambda expects an hs.PredictResponse object to be returned from the Python model. See the 4th step here.
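For reference, a minimal sketch of the full handler under that requirement (assuming the output tensor goes into the PredictResponse outputs map under a key such as "result"; the actual key must match the model's contract):

import hydro_serving_grpc as hs

def subtract(x, y):
    result = x.double_val[0] - y.double_val[0]
    tensor_result_output = hs.TensorProto(
        dtype=hs.DT_DOUBLE,
        double_val=[result]
    )
    # wrap the output tensor in a PredictResponse instead of returning it directly
    return hs.PredictResponse(outputs={"result": tensor_result_output})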
Great, that worked! You have been very responsive! Thanks!
Created a Model - a simple calculator service - with standard add, subtract, multiply ... operations that take 2 parameters and output a single scalar.
The Model uploaded fine - no issues [the earlier suggestion helped].
Created an Application.
Attached screenshots -