Closed cbqin closed 4 years ago
It's a mix of 2 issues:
Please try:
inference_func(**{k: tf.expand_dims(v, axis=0) for k, v in inputs.items()})
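The call above gives every feature a leading batch dimension before unpacking the dict as keyword arguments, because a serving signature expects batched tensors. A minimal sketch of the same reshaping, with NumPy standing in for TensorFlow and the feature names being made-up placeholders:

```python
import numpy as np

# A single un-batched example: each feature is a rank-1 array of shape (3,).
# "input_ids" / "attention_mask" are hypothetical feature names.
inputs = {
    "input_ids": np.array([101, 2009, 102]),
    "attention_mask": np.array([1, 1, 1]),
}

# Add a leading batch axis so each shape goes from (3,) to (1, 3),
# mirroring tf.expand_dims(v, axis=0).
batched = {k: np.expand_dims(v, axis=0) for k, v in inputs.items()}

print(batched["input_ids"].shape)  # (1, 3)
```

If the shapes in the error were e.g. `(3,)` vs `(None, 3)`, this missing batch axis is the usual cause.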
@cbqin @mandubian Hi, have you solved this problem? Could you explain it? I ran into a similar problem. By the way,
loaded = tf.saved_model.load("/content/saved")
inference_func = loaded.signatures["serving_default"]
# is this line necessary? why not just call loaded(inputs) directly at inference time?
for inputs, _ in valid_dataset:
    print(inference_func(inputs))
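On the question in the comment: a SavedModel can be invoked either through a named signature (which takes keyword tensor arguments and returns a dict) or, if the saved object defines one, through its direct `__call__`. A library-free toy sketch of the two calling conventions (this class is a stand-in, not the real TF object):

```python
# Toy stand-in for a loaded SavedModel, to contrast the two call styles.
class ToySavedModel:
    def __init__(self):
        # signatures are named endpoints that take keyword arguments
        self.signatures = {"serving_default": self._serve}

    def _serve(self, **named_inputs):
        # returns a dict of named outputs, like a real serving signature
        return {"logits": [sum(v) for v in named_inputs.values()]}

    def __call__(self, inputs):
        # direct call takes the input structure positionally
        return self._serve(**inputs)


model = ToySavedModel()
inputs = {"x": [1, 2, 3]}

# Both styles reach the same computation in this toy:
assert model(inputs) == model.signatures["serving_default"](**inputs)
```

Whether `loaded(inputs)` works on a real SavedModel depends on how the model was exported; the `signatures["serving_default"]` route is the one that is always present after a standard Keras `model.save(...)` export.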
@xiaoyangnihao I am hitting an incompatible-shape issue too. Have you solved the error?
❓ Questions & Help
Hi, I followed the script in the readme, trained a model, and saved it in the TensorFlow saved_model format instead of the h5 format. When inferencing, I run into a problem: I don't know how to feed the inputs to the model. Here is the code.
I changed the last line of the code to export a TensorFlow saved_model, and I get a problem when inferencing.
Then I get:
Has anyone encountered this problem before?