Bug: trouble loading and using a TensorFlow SavedModel with a custom input layer in Keras (worked fine for 2+ years) #1924
Hi everyone,
I'm currently working on a project where I need to load a TensorFlow SavedModel and use it with a custom input layer in Keras. However, I'm running into issues when trying to perform inference with the model. Here’s what I’ve done so far:
1. Loaded the SavedModel as an inference-only layer:

```python
import tensorflow as tf
from keras.layers import TFSMLayer
from keras import Input
from keras.models import Model

base_model = TFSMLayer("path/to/savedmodel", call_endpoint='serving_default')
input_layer = Input(shape=(32,), dtype='float64')
output_layer = base_model(input_layer)
model = Model(inputs=input_layer, outputs=output_layer)
```

2. Converted the test data to float64:

```python
import pandas as pd

test_data = pd.read_csv("path/to/testdata.csv")
test_data_float64 = tf.cast(test_data.values, tf.float64)
```

3. Attempted to use the model for inference:

```python
predictions = model(test_data_float64)
```
However, when I do this I run into input dtype and shape compatibility errors.
My Questions:

1. **Data type compatibility:** How can I ensure that the input data is correctly formatted and compatible with the expected input dtype of the `TFSMLayer`?
2. **Shape issues:** Are there any common pitfalls or best practices when dealing with custom input layers in Keras models that load TensorFlow SavedModels?
3. **Inference with custom layers:** Is there a better approach to modifying the input layer of a pre-trained TensorFlow SavedModel for inference in Keras?
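For reference, this is how I am sanity-checking the CSV data on its own before it ever reaches the model. The `DataFrame` here is a random stand-in for my real CSV, and the 32-column width is a placeholder:

```python
import numpy as np
import pandas as pd

# Random stand-in for my real CSV: 8 rows, 32 feature columns.
df = pd.DataFrame(np.random.rand(8, 32))

x = df.to_numpy()
assert x.ndim == 2 and x.shape[1] == 32, f"unexpected shape {x.shape}"

# pandas produces float64 by default, but the SavedModel appears to have
# been traced with float32, so I cast down before inference.
x32 = x.astype(np.float32)
print(x32.dtype, x32.shape)  # float32 (8, 32)
```

The data side passes these checks, which is why I suspect the mismatch is between the float64 `Input` layer I declared and whatever dtype the SavedModel's serving signature expects.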
Any guidance or suggestions on how to resolve these issues would be greatly appreciated. Thank you!
If someone could please help me with this, I am happy to pay or do whatever it takes, and I am willing to share more code or anything else needed to find a solution/fix.