yashkumaratri closed this issue 5 years ago
Don't serialize each result separately at inference time; run the prediction, take the predicted classes and the probability factor together, and serialize them later in a single response.
Whole server-side code:
```python
import argparse
from http import server

from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier

from graphpipe import convert


def create_diabetes_model():
    # Train a small random forest on the diabetes dataset.
    diabetes = datasets.load_diabetes()
    diabetes_X = diabetes.data
    print(diabetes_X.shape)
    diabetes_X_train = diabetes_X[:-20]
    diabetes_y_train = diabetes.target[:-20]
    clf = RandomForestClassifier(n_estimators=100, max_depth=2, random_state=0)
    clf.fit(diabetes_X_train, diabetes_y_train)
    return clf


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("--port", default=10000, help="TCP port", type=int)
    args = parser.parse_args()
    server_address = ('', args.port)

    class GPHandler(server.BaseHTTPRequestHandler):
        def do_POST(self):
            inp = self.rfile.read(int(self.headers['Content-Length']))
            obj = convert.deserialize_request(inp).input_tensors[0]
            outp = model.predict(obj)           # predicted classes
            outprob = model.predict_proba(obj)  # per-class probabilities
            # Serialize both arrays into one response instead of
            # serializing each result on its own.
            out = convert.serialize_infer_response([outp, outprob])
            self.send_response(200)
            self.end_headers()
            self.wfile.write(out)

    httpd = server.HTTPServer(server_address, GPHandler)
    model = create_diabetes_model()
    print('Starting graphpipe sklearn server on port %d...' % args.port)
    while True:
        httpd.handle_request()
```
You can get the response in `pred` on the client side; the prediction is available as `pred[0]` and the probabilities as `pred[1]`.
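For reference, here is a minimal client sketch. Assumptions (not from the original thread): the server above is running locally on port 10000, the input is a single row with the diabetes dataset's 10 features, and `remote.execute` hands back the output tensors in the order the server serialized them, so the `pred[0]`/`pred[1]` indexing below follows from that assumption.

```python
import numpy as np
from graphpipe import remote

# One test row shaped like the diabetes features (10 columns), float32.
data = np.random.rand(1, 10).astype(np.float32)

# Assumption: the response carries both tensors serialized by the server,
# so pred[0] is the predicted class and pred[1] the per-class probabilities.
pred = remote.execute("http://127.0.0.1:10000", data)
print("prediction:", pred[0])
print("probabilities:", pred[1])
```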
Following the sklearn tutorial (https://github.com/oracle/graphpipe-py/blob/master/examples/sklearn_example/server.py), the server returns a response, but I need to return the model response as well as the probability factor. I can't concat the bytearray outputs, so how do I do it?
I want to send both outp and outp1, how?