zshancock opened this issue 3 years ago (status: Open)
Additionally, I tried putting this model on AI Platform (framework 2.3) and making the call with this payload (which worked for a Mask RCNN of similar configuration in the TF 1.15 framework):

```json
{"instances":
    [
        {"inputs": {"b64": encoded_image}}
    ]
}
```

And that call errored with:

```
"Failed to process element: 0 key: inputs of 'instances' list. Error: Invalid argument: JSON object: does not have named input: inputs"
```
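The request body above can be built with the standard library alone; a minimal sketch, in which the image bytes are a placeholder and the `"inputs"` key is assumed to match a named input of the deployed model's serving signature:

```python
# Construct the AI Platform "instances" payload with a base64-encoded image.
# img_data is a placeholder for real image bytes; the "inputs" key must match
# a named input of the exported model's serving signature.
import base64
import json

img_data = b"\x89PNG\r\n\x1a\n"  # stand-in for real image bytes
encoded_image = base64.b64encode(img_data).decode("utf-8")

payload = {"instances": [{"inputs": {"b64": encoded_image}}]}
body = json.dumps(payload)
print(body)
```

The error message ("does not have named input: inputs") suggests the exported model's serving signature uses a different input name than the `"inputs"` key in this payload.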
I have the same problem when I try to put the Mask RCNN from the model zoo on AI Platform models for inference.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
1. The entire URL of the file you are using
https://github.com/tensorflow/models/blob/master/research/object_detection/exporter_main_v2.py
2. Describe the bug

I trained a model with `model_main_tf2.py`, then used `exporter_main_v2.py` (with `encoded_image_string_tensor`) on the checkpoint directory, and that generated the expected files (chiefly the `saved_model.pb`). However, when I try to load the model with `tf.saved_model.load()` I get unexpected behavior. I cannot figure out how to make the call to the loaded model, as it has 200+ inputs of unknown name, and the outputs do not look correct either. I looked everywhere for signatures but can only find a concrete function.

3. Steps to reproduce
1. Train with `mask_rcnn_inception_resnet_tf2.config` using `model_main_tf2.py`.
2. Export with `exporter_main_v2.py`, with `encoded_image_string_tensor` as the input_type.
3. Load the resulting `saved_model.pb` with `tf.saved_model.load()`.

4. Expected behavior
```python
inputs = base64.b64encode(img_data).decode()  # img_data is a bytes-like representation
model(inputs)
```
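With a TF2 export, the call that works goes through the serving signature rather than the loaded object itself. A minimal self-contained sketch of that pattern; the `Toy` module, the `input_tensor` name, and the `num_detections` output key below are illustrative stand-ins for the real detection export (with the actual model, point `tf.saved_model.load` at the export directory instead):

```python
# Sketch: call an exported SavedModel through its serving signature,
# not by calling the loaded object directly. The tiny module here
# stands in for the real detection export so the snippet runs on its own.
import tempfile
import tensorflow as tf

class Toy(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.string, name="input_tensor")])
    def serve(self, images):
        # Stand-in for real detection outputs: one count per batch.
        return {"num_detections": tf.cast(tf.shape(images)[0], tf.float32)}

export_dir = tempfile.mkdtemp()
toy = Toy()
tf.saved_model.save(toy, export_dir, signatures={"serving_default": toy.serve})

loaded = tf.saved_model.load(export_dir)
infer = loaded.signatures["serving_default"]      # use the signature, not loaded(...)
out = infer(input_tensor=tf.constant([b"raw_image_bytes"]))
```

Calling `loaded.signatures['serving_default'](...)` with the signature's named input avoids the "200+ inputs" confusion, which comes from inspecting the concrete function's flattened inputs (these include captured variables).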
5. Additional context
I made several attempts to resolve this. One error that came up often was:
(unexpected) The inputs of my loaded model with `model.signatures['serving_default'].inputs`:

(unexpected) The outputs with `model.signatures['serving_default'].outputs`:
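A way to see the signature's named inputs and outputs without the noise of the flattened `.inputs`/`.outputs` lists is to read the structured signature instead. A sketch, again built around a throwaway module so it runs on its own (names are illustrative; with the real export, load its directory):

```python
# Sketch: list the serving signature's named inputs/outputs via the
# structured signature, rather than the flattened .inputs/.outputs lists
# (which also show captured variables). Toy module stands in for the export.
import tempfile
import tensorflow as tf

class Toy(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.string, name="input_tensor")])
    def serve(self, x):
        return {"detection_scores": tf.zeros([tf.shape(x)[0], 100])}

path = tempfile.mkdtemp()
m = Toy()
tf.saved_model.save(m, path, signatures={"serving_default": m.serve})

sig = tf.saved_model.load(path).signatures["serving_default"]
input_names = list(sig.structured_input_signature[1].keys())  # named inputs only
output_names = list(sig.structured_outputs.keys())            # named outputs
print(input_names, output_names)
```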
6. System information
I am using the `tensorflow/tensorflow:2.3.0-gpu` Docker image.