ksalama / tf-estimator-tutorials

This repository includes tutorials on how to use the TensorFlow estimator APIs to perform various ML tasks, in a systematic and standardised way.

Unable to deploy model on cloud ml and serve predictions #4

Open Bhavyabhushanyadav opened 6 years ago

Bhavyabhushanyadav commented 6 years ago

Hi, this custom estimator cannot be deployed on Cloud ML.

It is not clear how the notebook can be packaged as a trainer and deployed on Cloud ML.

The serving of the model is also unclear: the Cloud ML Engine prediction API accepts JSON or CSV requests, but the code does not show how predictions can be served.

https://github.com/ksalama/tf-estimator-tutorials/blob/master/04%20-%20Times%20Series/02.0%20-%20TF%20ARRegressor%20-%20Experiment%20%2B%20CSV.ipynb

ksalama commented 6 years ago

Hello - For packaging, training and deploying to Cloud ML Engine, please refer to the examples in the cloudml-samples.

In addition, I have recently added a cloudml-template to simplify the packaging process. It includes two examples of custom estimators, one for regression and one for classification problems. The template and the examples follow the same code as the tutorials.

I hope you find this useful and that it lets you close the issue.
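For a rough picture of the packaging side, a trainer entry point (task.py) for Cloud ML Engine typically looks something like the simplified sketch below; the names create_estimator and generate_input_fn and the flags are placeholders, not the template's exact code.

# simplified, hypothetical task.py; create_estimator and generate_input_fn are placeholders
import argparse
import tensorflow as tf

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--train-files', required=True)
    parser.add_argument('--eval-files', required=True)
    parser.add_argument('--job-dir', required=True)   # supplied by Cloud ML Engine
    parser.add_argument('--train-steps', type=int, default=1000)
    args = parser.parse_args()

    # build the custom estimator, writing checkpoints and exports under job-dir
    estimator = create_estimator(model_dir=args.job_dir)

    train_spec = tf.estimator.TrainSpec(
        input_fn=lambda: generate_input_fn(args.train_files, mode=tf.estimator.ModeKeys.TRAIN),
        max_steps=args.train_steps)
    eval_spec = tf.estimator.EvalSpec(
        input_fn=lambda: generate_input_fn(args.eval_files, mode=tf.estimator.ModeKeys.EVAL))

    tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)

if __name__ == '__main__':
    main()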

Bhavyabhushanyadav commented 6 years ago

Thanks for the links, and I appreciate your great contribution.

I have referred to them, but ARRegressor prediction is very confusing, and the links above don't explain it.

As you mentioned, the provided links are clear and precise in explaining a classifier or regressor model, but the ARRegressor model is quite different.

CODE1##############################################################

FORECAST_STEPS = [10, 50, 100, 150, 200, 250, 300]

tf.logging.set_verbosity(tf.logging.ERROR)

# evaluate once to obtain the model state used for continuation forecasts
eval_input_fn = generate_input_fn(file_names=TRAIN_DATA_FILES, mode=tf.estimator.ModeKeys.EVAL)
evaluation = estimator.evaluate(input_fn=eval_input_fn, steps=1)

df_test = pd.read_csv(TEST_DATA_FILE, names=['time_index', 'value'], header=0)
print("Test Dataset Size: {}".format(len(df_test)))
print("")

for steps in FORECAST_STEPS:

    # continue the series from the evaluation state for the next `steps` points
    forecasts = estimator.predict(input_fn=ts.predict_continuation_input_fn(evaluation, steps=steps))
    forecasts = tuple(forecasts)[0]

    x_next = forecasts['times']

    y_next_forecast = forecasts['mean']
    y_next_actual = df_test.value[:steps].values

    rmse = compute_rmse(y_next_actual, y_next_forecast)
    mae = compute_mae(y_next_actual, y_next_forecast)

    print("Forecast Steps {}: RMSE {} - MAE {}".format(steps, rmse, mae))

print("")
print(forecasts.keys())
###################################################################

CODE2############################################################

import os

saved_model_dir = export_dir + "/" + os.listdir(path=export_dir)[-1]

input_values = df_test.value[:40].values

print(saved_model_dir)

predictor_fn = tf.contrib.predictor.from_saved_model(export_dir=saved_model_dir)

times = np.arange(1, 250)

output = predictor_fn({
    "model_state_00": [input_values],
    "model_state_01": input_values.reshape(1, 40, 1),
    "times": [times]
})

predictions = list(map(lambda ls: ls[0], output["mean"][0]))
##################################################################

The above-mentioned CODE1 and CODE2 correspond to prediction without a SavedModel and prediction using a SavedModel.

CODE1 is easier to comprehend, but CODE2 is closer to real-world use on the cloud, and the tough part is that it is hard to see how this maps to passing a JSON request to a model served on the Cloud ML API for future predictions.
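For example, I would expect the online prediction call to look roughly like the sketch below (using the Google API Python client), but I cannot tell whether these instance keys and shapes actually match the exported serving signature; the project, model and version names are placeholders, and input_values and times come from CODE2.

from googleapiclient import discovery

# build the Cloud ML Engine client and address a deployed model version (placeholder names)
service = discovery.build('ml', 'v1')
name = 'projects/my-project/models/my_ar_model/versions/v1'

# one instance per example; keys mirror the CODE2 feed dict, shapes are my guess
instance = {
    "model_state_00": input_values.tolist(),                  # last 40 observed values
    "model_state_01": input_values.reshape(40, 1).tolist(),   # same values, shaped (40, 1)
    "times": times.tolist()                                   # future time indices to forecast
}

response = service.projects().predict(name=name, body={"instances": [instance]}).execute()
print(response.get('predictions', response))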

(Also note that the exported model is only a .pbtxt graph and not a saved_model.pb with variables, so even the export function needs to be tweaked to make it work on the cloud.)
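From what I can tell, the export would need to look roughly like this sketch, assuming the time-series estimator exposes build_raw_serving_input_receiver_fn as the tf.contrib.timeseries examples do; the trained estimator comes from CODE1 and the export path is a placeholder.

# build a raw serving input receiver and export a binary SavedModel
serving_input_fn = estimator.build_raw_serving_input_receiver_fn()
saved_model_path = estimator.export_savedmodel(
    export_dir_base="exported_ar_model",
    serving_input_receiver_fn=serving_input_fn,
    as_text=False)  # write saved_model.pb (plus variables/) rather than a .pbtxt graph
print(saved_model_path)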

Could you please make the necessary code changes so that it can be served on Cloud ML?

Currently this is not possible for ARRegressor

Thanks for your time...

Bhavyabhushanyadav commented 6 years ago

@ksalama please kindly look into this, and thanks again for your time.