massquantity / LibRecommender

Versatile End-to-End Recommender System
https://librecommender.readthedocs.io/
MIT License

How to get predictions from a DeepFM model saved with libserving.serialization.tf_saved() #499

Open · BhaveshBhansali opened this issue 2 months ago

BhaveshBhansali commented 2 months ago

Hello,

I have saved a DeepFM model using `libserving.serialization.tf_saved(model)`. I load the model with: `loaded_model = tf.saved_model.load('model')`

The model's predict signature looks like: `_SignatureMap({'predict': <ConcreteFunction pruned(dense_values, item_indices, sparse_indices, user_indices) at 0x56E44DA90>})`
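
For reference, the exact input names, dtypes, and shapes that this pruned signature expects can be inspected on the loaded object; a minimal sketch, assuming the model was exported to the `'model'` directory used above:

```python
import tensorflow as tf

# Load the exported SavedModel (path assumed from the snippet above).
loaded_model = tf.saved_model.load("model")
predict_fn = loaded_model.signatures["predict"]

# Shows the keyword arguments the signature expects, with their dtypes and shapes.
print(predict_fn.structured_input_signature)
# Shows the structure and names of the output tensors.
print(predict_fn.structured_outputs)
```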

However, when I try to call the model as follows, I get an error:

`restored_output_tensor = loaded_model.signatures['predict'](feed_dict)  # where feed_dict is a dictionary of features`
`TypeError: pruned(dense_values, item_indices, sparse_indices, user_indices) takes 0 positional arguments, got 1.`
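
As a side note, functions retrieved from a SavedModel's `signatures` accept keyword arguments only, which is what this TypeError points at: the feature dictionary has to be unpacked with `**` rather than passed as a single positional argument. A minimal sketch with hypothetical placeholder tensors (the real dtypes, shapes, and values must match the saved signature), and leaving aside the compatibility point raised in the reply below:

```python
import tensorflow as tf

loaded_model = tf.saved_model.load("model")
predict_fn = loaded_model.signatures["predict"]

# Hypothetical feature tensors: the names come from the signature above, but the
# dtypes, shapes, and values here are illustrative placeholders only.
feed_dict = {
    "user_indices": tf.constant([0], dtype=tf.int32),
    "item_indices": tf.constant([10], dtype=tf.int32),
    "sparse_indices": tf.constant([[1, 2, 3]], dtype=tf.int32),
    "dense_values": tf.constant([[0.5]], dtype=tf.float32),
}

# Signature functions take keyword arguments only, which is why passing the dict
# positionally raises "takes 0 positional arguments, got 1".
restored_output = predict_fn(**feed_dict)
print(restored_output)
```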


massquantity commented 2 months ago

`libserving.serialization.tf_saved` is not compatible with `tf.saved_model.load`. If you want to save and load a model, use the Save/Load API.
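
For reference, a minimal sketch of that Save/Load flow, following the pattern shown in the LibRecommender documentation. It assumes a trained DeepFM `model` and its `data_info` are already in scope, and keyword arguments such as `manual` and `inference_only` may differ between library versions:

```python
from libreco.algorithms import DeepFM
from libreco.data import DataInfo

# `model` is a trained DeepFM instance and `data_info` the DataInfo built with it.
# Saving: persist the data_info alongside the model variables.
data_info.save(path="model_path", model_name="deepfm_model")
model.save(path="model_path", model_name="deepfm_model", manual=True, inference_only=True)

# Loading: restore the data_info first, then rebuild the model from it.
loaded_data_info = DataInfo.load("model_path", model_name="deepfm_model")
loaded_model = DeepFM.load(
    path="model_path",
    model_name="deepfm_model",
    data_info=loaded_data_info,
    manual=True,
)

# Predictions go through the library's own API rather than raw TF signatures.
print(loaded_model.predict(user=1, item=2))
```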