Closed martjushev closed 6 years ago
I never tried, if you succeed, please keep me updated.
I managed to make it work. One must convert the model using the following code:
```python
import tensorflow as tf
from tensorflow.python.saved_model.simple_save import simple_save

PATH_TO_CKPT = '../tf_model/frozen_inference_graph_face.pb'

detection_graph = tf.Graph()
with detection_graph.as_default():
    od_graph_def = tf.GraphDef()
    with tf.gfile.GFile(PATH_TO_CKPT, 'rb') as fid:
        serialized_graph = fid.read()
        od_graph_def.ParseFromString(serialized_graph)
        tf.import_graph_def(od_graph_def, name='')

    with tf.Session(graph=detection_graph) as sess:
        image_tensor = detection_graph.get_tensor_by_name('image_tensor:0')
        # Each box represents a part of the image where a particular object was detected.
        boxes = detection_graph.get_tensor_by_name('detection_boxes:0')
        # Each score represents the level of confidence for each of the objects.
        # The score is shown on the result image, together with the class label.
        scores = detection_graph.get_tensor_by_name('detection_scores:0')
        classes = detection_graph.get_tensor_by_name('detection_classes:0')
        num_detections = detection_graph.get_tensor_by_name('num_detections:0')

        simple_save(sess,
                    "../tf_model_export",
                    inputs={"image_tensor": image_tensor},
                    outputs={"boxes": boxes,
                             "scores": scores,
                             "classes": classes,
                             "num_detections": num_detections})
```
This exports a model that I successfully uploaded to Cloud ML and was able to test there.
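For testing on Cloud ML, online prediction takes a JSON body of the form `{"instances": [...]}`, where each instance supplies the exported input. A minimal sketch of building that body with only the standard library (the helper name is mine, and the image here is a placeholder nested-list uint8 tensor; the key must match the `"image_tensor"` input exported above):

```python
import json

def build_predict_request(image_array):
    """Wrap one HxWx3 image (as nested Python lists of uint8 values) in the
    JSON body Cloud ML online prediction expects: {"instances": [...]}.
    Each instance maps the exported input name to its value."""
    return json.dumps({"instances": [{"image_tensor": image_array}]})

# A 1x1 black "image", just to show the shape of the request body.
body = build_predict_request([[[0, 0, 0]]])
```

The response then contains one result per instance, keyed by the output names passed to `simple_save` (`boxes`, `scores`, `classes`, `num_detections`).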
@martjushev , great job! Would you please edit the README.md and open a pull request for other people?
With thanks and regards!
Yeephycho
I tried to deploy frozen_inference_graph_face.pb to Google CloudML but it failed with the following error message:
```
Create Version failed. Model validation failed: SavedModel must contain exactly one metagraph with tag: serve
For more information on how to export Tensorflow SavedModel, see https://www.tensorflow.org/api_docs/python/tf/saved_model.
```
Is it possible to convert the given model to a format that would be accepted by Cloud ML?
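The validation fails because a frozen `frozen_inference_graph_face.pb` is a bare GraphDef, not a SavedModel: it contains no metagraphs at all, let alone one tagged `serve`. A SavedModel export is a directory with a `saved_model.pb` (or `.pbtxt`) at its top level. A rough stdlib-only pre-upload check (the helper name is mine, not a TensorFlow API, and it does not verify the `serve` tag itself):

```python
import os

def looks_like_saved_model(export_dir):
    """Heuristic pre-upload check: a SavedModel export directory must
    contain a saved_model.pb (or saved_model.pbtxt) at its top level.
    A lone frozen GraphDef file fails this check."""
    return any(os.path.isfile(os.path.join(export_dir, name))
               for name in ('saved_model.pb', 'saved_model.pbtxt'))
```

A directory holding only the frozen `.pb` fails this check, while a directory written by `simple_save` (which writes `saved_model.pb` plus a `variables/` subdirectory) passes it.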