tensorflow / recommenders

TensorFlow Recommenders is a library for building recommender system models using TensorFlow.

Not able to save model for the quickstart example #276

Open YikSanChan opened 3 years ago

YikSanChan commented 3 years ago

Here is the quickstart example: https://www.tensorflow.org/recommenders/examples/quickstart

It works fine, but I am not able to save the model with model.save('my_model') at the end. It throws an exception:

WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.MovieLensModel object at 0x7f727eec7690>, because it is not built.
WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.MovieLensModel object at 0x7f727eec7690>, because it is not built.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-8-5a8adf13c269> in <module>()
----> 1 model.save("my_model")

3 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/saving/saving_utils.py in raise_model_input_error(model)
     96       'set. Usually, input shapes are automatically determined from calling'
     97       ' `.fit()` or `.predict()`. To manually set the shapes, call '
---> 98       '`model.build(input_shape)`.'.format(model))
     99 
    100 

ValueError: Model <__main__.MovieLensModel object at 0x7f727eec7690> cannot be saved because the input shapes have not been set. Usually, input shapes are automatically determined from calling `.fit()` or `.predict()`. To manually set the shapes, call `model.build(input_shape)`.

This is reproducible. See https://colab.research.google.com/drive/1yktbZXJUb__VAqWvrCH1h0xgJt3XR7Wc?usp=sharing

How can I fix this?

tigerneil commented 3 years ago

You may save model.user_model and model.movie_model separately.

tigerneil commented 3 years ago

Colab

Save the models separately and restore them to run inference.
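
For reference, a minimal sketch of this approach, assuming the quickstart's trained MovieLensModel (the names model, user_model, movie_model and the save paths below are illustrative):

import tensorflow as tf

# Save the two towers individually; they are built once .fit() has run.
model.user_model.save("user_model")
model.movie_model.save("movie_model")

# Load them back and score candidates manually.
user_model = tf.keras.models.load_model("user_model")
movie_model = tf.keras.models.load_model("movie_model")

user_embedding = user_model(tf.constant(["42"]))                  # (1, embedding_dim)
movie_embedding = movie_model(tf.constant(["Toy Story (1995)"]))  # (1, embedding_dim)
score = tf.matmul(user_embedding, movie_embedding, transpose_b=True)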

YannisPap commented 3 years ago

If you check the model serving section, you can use the "index" as the model.

import os
import tempfile

import tensorflow as tf

# Export the query model.
with tempfile.TemporaryDirectory() as tmp:
  path = os.path.join(tmp, "model")

  # Save the index.
  index.save(path)

  # Load it back; can also be done in TensorFlow Serving.
  loaded = tf.keras.models.load_model(path)

  # Pass a user id in, get top predicted movie titles back.
  scores, titles = loaded(["42"])

  print(f"Recommendations: {titles[0][:3]}")

dunnkers commented 2 years ago

Thanks @tigerneil and @YannisPap. I found that saving the index is the most convenient approach:

index = tfrs.layers.factorized_top_k.BruteForce(model.user_model)
index.index_from_dataset(
    movies.batch(100).map(lambda title: (title, model.movie_model(title))))

index.save("saved_model")

❌ But still, I got a ValueError:

Traceback (most recent call last):
  File "/Users/dunnkers/git/recommender/src/train_model/train_model.py", line 91, in train_model
    index.save("saved_model")
  File "./venv/lib/python3/site-packages/keras/engine/training.py", line 2145, in save
    save.save_model(self, filepath, overwrite, include_optimizer, save_format,
  File "./venv/lib/python3/site-packages/keras/saving/save.py", line 149, in save_model
    saved_model_save.save(model, filepath, overwrite, include_optimizer,
  File "./venv/lib/python3/site-packages/keras/saving/saved_model/save.py", line 75, in save
    saving_utils.raise_model_input_error(model)
  File "./venv/lib/python3/site-packages/keras/saving/saving_utils.py", line 84, in raise_model_input_error
    raise ValueError(
ValueError: Model <tensorflow_recommenders.layers.factorized_top_k.BruteForce object at 0x1551f0fa0> cannot 
be saved because the input shapes have not been set. Usually, input shapes are automatically determined 
from calling `.fit()` or `.predict()`. To manually set the shapes, call `model.build(input_shape)`.


Still, TensorFlow did not know the input shapes. This can be fixed by running a single prediction through the index, via index() or index.predict(), before saving:

import numpy as np
import tensorflow_recommenders as tfrs

index = tfrs.layers.factorized_top_k.BruteForce(model.user_model)
index.index_from_dataset(
    movies.batch(100).map(lambda title: (title, model.movie_model(title))))

# Calling the index once builds the input shapes, so it can be saved.
index(np.array(["42"]))
index.save("saved_model")

✅ Model saving now works.
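
To double-check the export, the saved index can be loaded back and queried, mirroring the serving snippet earlier in the thread (the path and user id follow the code above):

loaded = tf.keras.models.load_model("saved_model")
scores, titles = loaded(np.array(["42"]))
print(f"Recommendations: {titles[0][:3]}")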

riyaj8888 commented 1 year ago

Hi,

I am trying to save the index in the following way:

items = user_item_interactions.map(lambda x: x["item-text"])
index = tfrs.layers.factorized_top_k.BruteForce(model.query_model)
index.index_from_dataset(
    tf.data.Dataset.zip((items.batch(4),
                         user_item_interactions.batch(4).map(model.candidate_model))))
tf.saved_model.save(index, path)

There is no error while saving the model, and loading it back with loaded = tf.saved_model.load(path) succeeds. But the call below fails:

scores, titles = loaded({"userid": tf.constant(["b7861b82-e60c-4506-abea-7c88666bc9ab"])})
print(f"Recommendations: {titles[0][:3]}")

TypeError: '_UserObject' object is not callable
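
Note: as with the ValueError earlier in this thread, this typically means the BruteForce layer was never called before tf.saved_model.save, so no concrete __call__ function was traced into the SavedModel. A minimal sketch of the same workaround used above (the {"userid": ...} input structure is an assumption based on the snippet):

# Call the index once before saving so a concrete __call__ is traced.
index({"userid": tf.constant(["b7861b82-e60c-4506-abea-7c88666bc9ab"])})
tf.saved_model.save(index, path)

# The reloaded object should then be callable.
loaded = tf.saved_model.load(path)
scores, titles = loaded({"userid": tf.constant(["b7861b82-e60c-4506-abea-7c88666bc9ab"])})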