Closed fuhailin closed 3 years ago
@fuhailin Please provide all the dependencies needed to run the shared code, or if possible share a Colab gist with the reported issue. I ran the code shared and faced this error.
Updated, please check this colab : https://colab.research.google.com/gist/fuhailin/32e3b834cff5856c53d719fe877414cb/untitled418.ipynb
@fuhailin I cannot download the CSV; please share the dataset so we can replicate the issue.
I am able to replicate the reported issue; please find the gist here.
Hi there. You can run inference with your reloaded subclassed tf.keras.Model loaded_model by calling:
y_pred = loaded_model.call({"age": [[35]], "education": [["Bachelors"]]})
y_pred = loaded_model.call({"age": [[40]], "education": [["Assoc-voc"]]})
or
y_pred = loaded_model.call({"age": [[35], [40]], "education": [["Bachelors"], ["Assoc-voc"]]})
The error message you received stated that it was expecting arguments "{'age': TensorSpec(shape=(None, 1), dtype=tf.int64, name='age'), 'education': TensorSpec(shape=(None, 1), dtype=tf.string, name='education')}", so the shape of your inputs should be of shape (None, 1).
You can check the shape by:
import tensorflow as tf

a = tf.constant([35])
b = tf.constant([[35]])
c = tf.constant([[35], [40]])
a.shape # TensorShape([1])
b.shape # TensorShape([1, 1])
c.shape # TensorShape([2, 1])
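If your raw inputs arrive as flat lists, you can reshape them into the expected (None, 1) form. A small sketch using tf.reshape (the variable names here are illustrative):

```python
import tensorflow as tf

# A flat batch of ages with shape (2,) -- this would NOT match (None, 1).
ages = tf.constant([35, 40], dtype=tf.int64)

# Reshape to a column vector of shape (2, 1), which matches (None, 1).
ages_2d = tf.reshape(ages, (-1, 1))
print(ages_2d.shape)  # (2, 1)
```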
The None in the TensorSpec shape parameter means that the batch size is variable.
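To illustrate, a tf.function traced with a (None, 1) input signature (mirroring the TensorSpec in the error message; double_age is a hypothetical function, not part of the issue's code) accepts any batch size, but the rank must match:

```python
import tensorflow as tf

# The None in the first dimension lets the traced function accept
# any batch size, as long as the input is rank 2 with one feature.
@tf.function(input_signature=[
    tf.TensorSpec(shape=(None, 1), dtype=tf.int64, name="age")])
def double_age(age):
    return age * 2

print(double_age(tf.constant([[35]], dtype=tf.int64)).shape)        # (1, 1)
print(double_age(tf.constant([[35], [40]], dtype=tf.int64)).shape)  # (2, 1)
```

Passing a rank-1 tensor such as tf.constant([35]) would raise an error, because the signature requires a matching rank even though the batch size is free.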
Closing this issue, but please reopen if I overlooked something. Thanks.
Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:bug_template
System information
You can collect some of this information using our environment capture script. You can also obtain the TensorFlow version with:
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)" (TF 1.x)
python -c "import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)" (TF 2.x)
Describe the current behavior
I am trying to create a custom classification model with TensorFlow 2.3 using the tf.keras.Model subclassing method. In the subclassed model's init function, I use a tf.feature_column layer to preprocess features. With all of the above, I can train, save, and reload the SavedModel, but when I use the reloaded model for inference, I get the following error:
When I create the model with the tf.keras.Sequential class, or without the tf.feature_column layer, everything works fine. So how can I run inference with a reloaded subclassed tf.keras.Model that uses a tf.feature_column layer? This has puzzled me for days.
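For context, here is a minimal sketch of this pattern; MyModel, the feature columns, and the vocabulary are illustrative assumptions rather than the notebook's exact code:

```python
import tempfile
import tensorflow as tf

# A subclassed Keras model that preprocesses features with
# tf.feature_column via a DenseFeatures layer in __init__.
class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        columns = [
            tf.feature_column.numeric_column("age"),
            tf.feature_column.indicator_column(
                tf.feature_column.categorical_column_with_vocabulary_list(
                    "education", ["Bachelors", "Assoc-voc"])),
        ]
        self.features = tf.keras.layers.DenseFeatures(columns)
        self.out = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, inputs):
        return self.out(self.features(inputs))

model = MyModel()
# Inputs are shaped (batch, 1) to match the traced (None, 1) spec.
inputs = {"age": tf.constant([[35], [40]], dtype=tf.int64),
          "education": tf.constant([["Bachelors"], ["Assoc-voc"]])}
_ = model(inputs)  # build the model by calling it once

path = tempfile.mkdtemp()
model.save(path)  # exports in SavedModel format
loaded = tf.keras.models.load_model(path)
y_pred = loaded.call(inputs)  # inputs must match shape (None, 1)
print(y_pred.shape)  # (2, 1)
```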
Describe the expected behavior
A subclassed tf.keras.Model with a tf.feature_column layer should be able to run inference just like a Sequential model.
Standalone code to reproduce the issue Provide a reproducible test case that is the bare minimum necessary to generate the problem. If possible, please share a link to Colab/Jupyter/any notebook.
Jupyter minimal demo: https://www.kaggle.com/hailinfufu/notebookadf7121b80
Here is a minimal demo to reproduce my problem:
You can replace model = mymodel() with a tf.keras.Sequential model, and that will work fine.
After training and saving the model, I try to load the SavedModel to run predictions:
Other info / logs Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.