tensorflow / recommenders

TensorFlow Recommenders is a library for building recommender system models using TensorFlow.
Apache License 2.0

[Question] BruteForce Index not working as a saved model #640

Closed ericy51 closed 1 year ago

ericy51 commented 1 year ago

I've successfully created a model with context features for the query tower, and I can get recommendations with the following query:

index = tfrs.layers.factorized_top_k.BruteForce(model.query_model, k=20)
index.index_from_dataset(
    tf.data.Dataset.zip((product_lookup.batch(100), products.batch(100).map(model.candidate_model)))
)

query = dict(pd.DataFrame({
    'primary_id': ["unk"],
    'beauty_affinity': [0],
    'sexual_affinity': [0],
    'fashion_affinity': [1],
    'wellness_affinity': [0],
    'home_affinity': [0],
}).iloc[0].map(lambda x: tf.expand_dims(x, axis=0)))

_, titles = index(query)
print(f"Recommendations for unk: {titles[0, :20]}")

However, when I build the index the same way, save it, and query the saved model, I get an error:

brute_force = tfrs.layers.factorized_top_k.BruteForce(model.query_model, k=20)
brute_force.index_from_dataset(
    tf.data.Dataset.zip((product_lookup.batch(100), products.batch(100).map(model.candidate_model)))
)

query = dict(pd.DataFrame({
    'primary_id': ["am_183827917602"],
    'beauty_affinity': [0.85],
    'sexual_affinity': [0],
    'fashion_affinity': [0.04],
    'wellness_affinity': [0],
    'home_affinity': [0.09],
}).iloc[0].map(lambda x: tf.expand_dims(x, axis=0)))

_ = brute_force(query)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "model")
    tf.saved_model.save(
        brute_force,
        path,
        # options=tf.saved_model.SaveOptions(namespace_whitelist=["Scann"])
    )
    loaded = tf.saved_model.load(path)

test_query = dict(pd.DataFrame({
    'primary_id': ["am_183827"],
    'beauty_affinity': [0.85],
    'sexual_affinity': [0],
    'fashion_affinity': [0.04],
    'wellness_affinity': [0],
    'home_affinity': [0.09],
}).iloc[0].map(lambda x: tf.expand_dims(x, axis=0)))

_, titles = loaded(test_query)

print(f"Recommendations for: {titles[0, :20]}")

Error:

ValueError: Could not find matching concrete function to call loaded from the SavedModel. Got:
  Positional arguments (3 total):
    * {'primary_id': <tf.Tensor 'queries_3:0' shape=(1,) dtype=string>, 'beauty_affinity': <tf.Tensor 'queries:0' shape=(1,) dtype=float64>, 'sexual_affinity': <tf.Tensor 'queries_4:0' shape=(1,) dtype=int64>, 'fashion_affinity': <tf.Tensor 'queries_1:0' shape=(1,) dtype=float64>, 'wellness_affinity': <tf.Tensor 'queries_5:0' shape=(1,) dtype=int64>, 'home_affinity': <tf.Tensor 'queries_2:0' shape=(1,) dtype=float64>}
    * None
    * False
  Keyword arguments: {}

 Expected these arguments to match one of the following 4 option(s):

Option 1:
  Positional arguments (3 total):
    * {'beauty_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='beauty_affinity'), 'fashion_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='fashion_affinity'), 'wellness_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='wellness_affinity'), 'primary_id': TensorSpec(shape=(None,), dtype=tf.string, name='primary_id'), 'sexual_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='sexual_affinity'), 'home_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='home_affinity')}
    * None
    * False
  Keyword arguments: {}

Option 2:
  Positional arguments (3 total):
    * {'home_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/home_affinity'), 'sexual_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='queries/sexual_affinity'), 'beauty_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/beauty_affinity'), 'primary_id': TensorSpec(shape=(None,), dtype=tf.string, name='queries/primary_id'), 'wellness_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='queries/wellness_affinity'), 'fashion_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/fashion_affinity')}
    * None
    * False
  Keyword arguments: {}

Option 3:
  Positional arguments (3 total):
    * {'primary_id': TensorSpec(shape=(None,), dtype=tf.string, name='queries/primary_id'), 'wellness_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='queries/wellness_affinity'), 'home_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/home_affinity'), 'sexual_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='queries/sexual_affinity'), 'beauty_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/beauty_affinity'), 'fashion_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='queries/fashion_affinity')}
    * None
    * True
  Keyword arguments: {}

Option 4:
  Positional arguments (3 total):
    * {'wellness_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='wellness_affinity'), 'primary_id': TensorSpec(shape=(None,), dtype=tf.string, name='primary_id'), 'sexual_affinity': TensorSpec(shape=(None,), dtype=tf.int64, name='sexual_affinity'), 'fashion_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='fashion_affinity'), 'beauty_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='beauty_affinity'), 'home_affinity': TensorSpec(shape=(None,), dtype=tf.float32, name='home_affinity')}
    * None
    * True
  Keyword arguments: {}

I'm not sure why the behavior changes once the model is saved. The same query works against the in-memory index but not against the loaded SavedModel.
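Looking at the error more closely, the query built from the pandas DataFrame comes through as float64 (pandas' default float dtype) and int64, while every option in the saved signature expects float32 for beauty_affinity, fashion_affinity, and home_affinity. A sketch of building the query with explicit dtypes, which I'd expect to match the saved specs (the values here are just the ones from my test query):

```python
import tensorflow as tf

# Build the query dict directly with explicit dtypes so each tensor matches
# the TensorSpecs listed in the error (float32 affinities, int64 counters,
# string id), instead of the float64 that pandas produces by default.
test_query = {
    "primary_id": tf.constant(["am_183827"], dtype=tf.string),
    "beauty_affinity": tf.constant([0.85], dtype=tf.float32),
    "sexual_affinity": tf.constant([0], dtype=tf.int64),
    "fashion_affinity": tf.constant([0.04], dtype=tf.float32),
    "wellness_affinity": tf.constant([0], dtype=tf.int64),
    "home_affinity": tf.constant([0.09], dtype=tf.float32),
}
```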

patrickorlando commented 1 year ago

Hi @ericy51, try using model.save(path) or tf.keras.models.save_model(model, path) to serialise the model. You should then be able to load it with tf.keras.models.load_model(path) or tf.saved_model.load(path).
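For example, here is a minimal round trip with a plain Keras model standing in for the query tower (the feature names are illustrative only; on newer Keras the path needs a .keras extension, while older TF versions wrote a SavedModel directory). The point is that Keras-aware saving preserves the call signature, so the loaded model accepts the same dict of features:

```python
import os
import tempfile
import tensorflow as tf

# Toy stand-in for the query tower: a functional model taking a dict of
# named scalar features, exactly like the query dicts in the question.
inputs = {
    "beauty_affinity": tf.keras.Input(shape=(), dtype=tf.float32, name="beauty_affinity"),
    "home_affinity": tf.keras.Input(shape=(), dtype=tf.float32, name="home_affinity"),
}
features = tf.keras.layers.Concatenate()([
    tf.keras.layers.Reshape((1,))(inputs["beauty_affinity"]),
    tf.keras.layers.Reshape((1,))(inputs["home_affinity"]),
])
model = tf.keras.Model(inputs, tf.keras.layers.Dense(1)(features))

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "model.keras")
    model.save(path)  # equivalently: tf.keras.models.save_model(model, path)
    loaded = tf.keras.models.load_model(path)
    # The loaded model is called with the same dict of features as before.
    out = loaded({"beauty_affinity": tf.constant([0.85]),
                  "home_affinity": tf.constant([0.09])})
```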

caesarjuly commented 1 year ago

Hi @ericy51, I also ran into this issue. This is a tricky part of TensorFlow; you can refer to my example. You have to run a prediction before saving the model, so that TF can automatically generate the signatures used for serving. https://github.com/tensorflow/tensorflow/issues/37439#issuecomment-596916472
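As a toy illustration of that point (a hypothetical Doubler module, not the thread's model): tf.saved_model.save exports only the concrete functions that have already been traced, so the module must be called once before saving, and later calls must match the traced dtype and shape exactly. A query with the wrong dtype raises the same "Could not find matching concrete function" ValueError as above:

```python
import os
import tempfile
import tensorflow as tf

class Doubler(tf.Module):
    @tf.function
    def __call__(self, x):
        return 2.0 * x

m = Doubler()
_ = m(tf.constant([1.0]))  # call once so a float32, shape-(1,) trace exists

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "doubler")
    tf.saved_model.save(m, path)  # exports the traced concrete function
    loaded = tf.saved_model.load(path)
    y = loaded(tf.constant([3.0]))  # matches the traced spec, so this works
    # loaded(tf.constant([3.0], dtype=tf.float64)) would raise the same
    # "Could not find matching concrete function" ValueError as in the issue.
```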