Closed — humanely closed this issue 6 years ago.
Hi Jolly, I guess you have probably found the answer already. To make it work, you need to do:

```python
with infer_model.graph.as_default():
    loaded_infer_model = model_helper.load_model(
        infer_model.model, ckpt_path, sess, "infer")
```

-Thang
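To illustrate the pattern Thang describes, here is a minimal sketch of keeping an explicit, reusable session open for serving. A toy graph stands in for the NMT inference model (that substitution is an assumption; in the real code, the `model_helper.load_model` call would go inside the `graph.as_default()` block as shown above):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # run in TF1 graph mode (also works on TF2 installs)

# Build the graph once. In the NMT code this is where infer_model is
# constructed and model_helper.load_model is called (assumed here; a toy
# computation is used instead so the sketch is self-contained).
graph = tf.Graph()
with graph.as_default():
    inputs = tf.placeholder(tf.float32, shape=[None], name="inputs")
    outputs = inputs * 2.0

# Create the session explicitly (no "with" block) so it persists for the
# lifetime of the serving process and can be reused across requests.
sess = tf.Session(graph=graph)

# Each serving request reuses the same open session.
result = sess.run(outputs, feed_dict={inputs: [1.0, 2.0]})
print(result)  # [2. 4.]
```

The key point is that the session is tied to the graph via `tf.Session(graph=graph)`, and any restore/load calls must happen under that graph's `as_default()` context; the session is then closed explicitly (`sess.close()`) only when the server shuts down.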
Hi,
I have to keep a persistent session for serving. So instead of creating a tf.Session with a "with" statement, I created one as under:
But this gives the following error (in model_helper.load_model): Can someone please suggest how to load an explicit session that can be reused?