mshearer0 opened 4 years ago
In addition to the above, the following predicts locally using `saved_model_cli` (pg. 140); replace the model number with your own:
```python
cli_input = "examples=" + repr([example.SerializeToString()]).replace("'", '"')

!saved_model_cli run --dir ../../interactive-pipeline/serving_model_dir/1600007776/ \
  --tag_set serve \
  --signature_def serving_default \
  --input_exprs='{cli_input}'
```
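For a value that serializes without any `"` bytes, the repr/replace trick produces a string that `--input_exprs` can evaluate as a Python expression. A minimal stdlib-only sketch (the payload bytes are made up for illustration):

```python
import ast

# Hypothetical serialized tf.Example payload containing no double-quote bytes.
serialized = b'simple payload without quotes'

# Same trick as above: repr() the list, then swap single quotes for double quotes.
cli_input = "examples=" + repr([serialized]).replace("'", '"')
print(cli_input)  # examples=[b"simple payload without quotes"]

# saved_model_cli evaluates the part after '=' as a Python expression;
# here that round-trips cleanly back to the original bytes.
value = ast.literal_eval(cli_input.split("=", 1)[1])
assert value == [serialized]
```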
Certain values, e.g. `'sub_product': 'Savings account'`, cause local prediction to fail, although GCP works fine. The difference seems to be that those values serialize to bytes that include a `"` character, causing:

`SyntaxError: unexpected character after line continuation character`
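The failure mode can be reproduced without TensorFlow: if the serialized bytes happen to contain a `"` (0x22) byte, swapping single quotes for double quotes leaves unescaped quotes inside a double-quoted literal, and the expression no longer parses. The exact `SyntaxError` message depends on where the quote lands, but the root cause is the same. A sketch with made-up bytes:

```python
import ast

# Hypothetical serialized payload that happens to contain a 0x22 (") byte.
serialized = b'payload with a " byte'

cli_input = "examples=" + repr([serialized]).replace("'", '"')
# cli_input is now: examples=[b"payload with a " byte"]
# The unescaped inner quotes make the expression invalid Python.

try:
    ast.literal_eval(cli_input.split("=", 1)[1])
    parsed = True
except SyntaxError:
    parsed = False

print("parsed:", parsed)  # parsed: False
```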
After deploying the model to GCP I found predictions required the features 'company' and 'timely_response', although these are not used in the model. The other features below were also required, but 'state' and 'zipcode' were not.
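For reference, the exported signature itself only exposes a single serialized `examples` input, which can be confirmed with `saved_model_cli show` (same directory and version number as above; adjust to your export). The per-feature requirements are enforced inside the graph when the serialized Example is parsed, which is why missing features only surface as prediction errors:

```shell
saved_model_cli show \
  --dir ../../interactive-pipeline/serving_model_dir/1600007776/ \
  --tag_set serve \
  --signature_def serving_default
```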
```python
import base64

import tensorflow as tf

exampledict = {
    'product': 'Bank account or service',
    'sub_product': 'Savings account',
    'timely_response': 'No',
    'company_response': 'Closed with monetary relief',
    'issue': 'Cash advance',
    'company': 'test',
    'consumer_complaint_narrative': 'happy with the service',
}

featuredict = {}
for key, value in exampledict.items():
    # Note: no trailing comma here -- it would wrap the Feature in a tuple
    # and break the tf.train.Features construction below.
    featuredict[key] = tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[value.encode('utf-8')]))

example = tf.train.Example(features=tf.train.Features(feature=featuredict))

input_data_json = {
    "signature_name": "serving_default",
    "instances": [
        {"examples": {"b64": base64.b64encode(
            example.SerializeToString()).decode('utf-8')}}
    ],
}
```
```python
request = ml_resource.predict(name=model_path, body=input_data_json)
response = request.execute()

if "error" in response:
    raise RuntimeError(response["error"])

for pred in response["predictions"]:
    print(pred)
```
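For context, `ml_resource` and `model_path` in the snippet above come from the Google Cloud AI Platform client setup. A sketch with placeholder project and model names (the `discovery.build` call needs `google-api-python-client` and GCP credentials, so it is left commented out here):

```python
# Requires: pip install google-api-python-client
# from googleapiclient import discovery
#
# api = discovery.build("ml", "v1")
# ml_resource = api.projects()

project = "my-gcp-project"   # placeholder: your GCP project id
model = "complaint_model"    # placeholder: your deployed model name
model_path = "projects/{}/models/{}".format(project, model)
print(model_path)  # projects/my-gcp-project/models/complaint_model
```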
```
[0.123692594]
```