Closed Rishit-dagli closed 3 years ago
Can you try setting the output_shape parameter when you create the KerasLayer? E.g.:
hub.KerasLayer(..., output_shape=[1024])
Note that the batch size is intentionally not part of output_shape.
@MorganR I tried that as well, and it doesn't affect the error.
If the model really always outputs the same size final dimension, then the ideal solution is for the model to be updated with a more accurate signature.
Can you share which model this is so we can debug further and look for other solutions?
Sure thing @MorganR . Here is a minimalistic Colab Notebook of what I'm trying and also includes a direct link to the SavedModel we created in the prequel notebooks (where it seems to work really well as a classifier, we just removed the classification head for this model): https://colab.research.google.com/drive/1EJGzibnljqxFyEeAc15sa5wAVJsVAyji?usp=sharing
Also, in the meantime I tried a workaround: manually updating the model's signatures, which seems to work for now. Here's some minimal code of what I did:
@tf.function()
def my_predict(input):
    inputs = {
        "my_serving_input": input,
    }
    return {
        # Reshaping with a concrete last dimension gives the output a static shape.
        "output": tf.reshape(
            model.signatures["serving_default"](**inputs)["output"], [-1, 1024]
        )
    }

my_signatures = my_predict.get_concrete_function(
    input=tf.TensorSpec([None, 384, 384, 3], dtype=tf.dtypes.float32, name="input")
)
tf.saved_model.save(model, export_dir=saved_model_dir, signatures=my_signatures)
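The effect of that tf.reshape trick can be checked in isolation: wrapping a reshape with a concrete last dimension in a tf.function gives the traced output a static final axis even when the input signature is fully dynamic. The shapes below are illustrative:

```python
import tensorflow as tf

# Input signature is fully dynamic, like the problematic SavedModel output.
@tf.function(input_signature=[tf.TensorSpec([None, None], tf.float32)])
def fix_shape(x):
    # Reshaping with a concrete last dimension pins the static shape.
    return tf.reshape(x, [-1, 1024])

concrete = fix_shape.get_concrete_function()
print(concrete.outputs[0].shape)  # last axis is now static: (None, 1024)
```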
I don't have bandwidth to look into this at the moment, so unassigning.
The model in question doesn't seem to be hosted on tfhub.dev. The ideal solution, as mentioned by MorganR, is to update how the model was exported so that the dimensions of the output tensor are fixed. If this is not possible, then calling tf.reshape() to fix up the dimensions makes sense; this is what one would do when using TF directly (without Keras). There might be nicer Keras-specific ways of achieving this. One option is to ask a question on the TF/Keras forums.
Got it, thanks @akhorlin. I will continue using tf.reshape(); thanks for confirming. I guess a nice Keras way to do the same would be to add a Reshape layer after the hub layer, but I will try exploring a bit more.
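A sketch of that Reshape idea, using a plain Input as a stand-in for the hub layer's dynamic output; the 1024 feature size and the 10-class head are placeholders:

```python
import tensorflow as tf

# Stand-in for the hub layer's output: last axis unknown at build time.
inputs = tf.keras.Input(shape=(None,))
# Reshape pins the last axis so the Dense layer can build its kernel.
fixed = tf.keras.layers.Reshape((1024,))(inputs)
outputs = tf.keras.layers.Dense(10, activation="softmax")(fixed)
model = tf.keras.Model(inputs, outputs)
print(model.output_shape)  # (None, 10)
```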
I was trying to fine-tune a SavedModel; here's some minimal code showing how I create the model for fine-tuning:
Here are the signatures for my SavedModel:
This always returns outputs as (-1, 1024), but the dynamic shape in the last axis seems to create problems when adding a Dense layer, giving an error:
Could I modify this hub layer so that it doesn't use a dynamic shape in the last axis, which I think might fix the problem, or would there be a better way?
cc: @sayakpaul
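The failure described above can be reproduced without the hub model at all: Keras's Dense layer cannot build its kernel when the incoming last axis is unknown. A minimal sketch, with made-up sizes:

```python
import tensorflow as tf

# Stand-in for a SavedModel output whose last axis is dynamic.
inputs = tf.keras.Input(shape=(None,))
try:
    tf.keras.layers.Dense(5)(inputs)  # needs a defined last dimension
except ValueError as err:
    print("Dense failed to build:", err)
```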