emwa0490 opened this issue 4 years ago
Do you want to save the entire model for serving, or just to be able to resume the training later?
Thank you for looking into this. I wanted to save the entire model, both for serving and to resume training later.
Alternatively, you can add a `call` method to the model.
What would this look like so that the models in https://www.tensorflow.org/recommenders/examples/basic_retrieval and https://www.tensorflow.org/recommenders/examples/basic_ranking can be served in TF Serving?
For retrieval, have a look at the basic and efficient serving tutorials.
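For concreteness, here is a hedged sketch of what the efficient-serving tutorial's brute-force index does under the hood, using only plain TensorFlow: score every candidate embedding against the query embedding and keep the top k. The toy titles and random embeddings below are stand-ins for the trained user and movie towers, not the tutorial's code.

```python
import tensorflow as tf

# Toy stand-ins for the trained towers' outputs (assumed shapes).
candidate_titles = tf.constant(["a", "b", "c", "d"])
candidate_embs = tf.random.normal([4, 8])  # movie_model(candidates)
query_emb = tf.random.normal([1, 8])       # user_model(user_id)

# Brute-force retrieval: dot-product scores, then top-k candidates.
scores = tf.matmul(query_emb, candidate_embs, transpose_b=True)  # shape (1, 4)
top_scores, top_idx = tf.math.top_k(scores, k=2)
top_titles = tf.gather(candidate_titles, top_idx[0])             # shape (2,)
```

In practice, `tfrs.layers.factorized_top_k.BruteForce` (or the ScaNN layer) wraps this pattern around the user tower so the whole lookup can be exported as one SavedModel.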
For ranking, the training model should be directly usable in serving, as it implements a `call` method.
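As a minimal sketch of why such a model exports cleanly: any subclassed Keras model that implements `call` (the model below is a toy stand-in, not the tutorial's RankingModel) can be traced once and then saved as a SavedModel for TF Serving.

```python
import tempfile
import tensorflow as tf

class ToyRankingModel(tf.keras.Model):
    """Toy stand-in for a subclassed model that implements `call`."""

    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, features):
        return self.dense(features)

model = ToyRankingModel()
model(tf.zeros([1, 4]))  # one forward pass traces the input shapes

export_dir = tempfile.mkdtemp()
tf.saved_model.save(model, export_dir)      # directory servable by TF Serving
reloaded = tf.saved_model.load(export_dir)
```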
Thanks. Sorry if this is basic, but I have read and run all 8 of the TFRS tutorials and still have the question. If I'm understanding it correctly, the efficient serving tutorial is referring to the two-tower matrix factorization models, but I didn't see a reference to serving the model with the `tf.keras.layers.Dense()` neural network layers in it. Presumably one would add code that calls RankingModel() in https://www.tensorflow.org/recommenders/examples/basic_ranking directly.
But in https://www.tensorflow.org/recommenders/examples/basic_ranking, "The full model" is not RankingModel() but MovielensModel(), and
```python
model = MovielensModel()
model.compile(optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.1))
model.fit(cached_train, epochs=3)
model.evaluate(cached_test, return_dict=True)
```

all run fine, but `model.save(...)` raises

```
NotImplementedError: When subclassing the `Model` class, you should implement a `call` method.
```

and `model.built` gives `False`.
So what does an added call method to MovielensModel() look like so that its results can be seen via TF Serving?
Or, if the question does not make sense, what is MovielensModel() doing?
Again, sorry to ask here, but I couldn't find an example online that addresses this.
Yes, you're quite right. In the tutorial it makes sense to separate the two to make the exposition easier, but you can merge RankingModel and MovielensModel together and export the result.

In this specific case, exporting RankingModel should work for you.
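A hedged sketch of what that merge could look like (feature names, vocabulary sizes, and layer widths are illustrative, not the tutorial's exact ones): fold the ranking network into a single subclassed model whose `call` takes the raw features, so the whole thing can be exported.

```python
import tempfile
import tensorflow as tf

class MergedMovielensModel(tf.keras.Model):
    """Illustrative merge of a ranking network into one exportable model."""

    def __init__(self):
        super().__init__()
        self.user_embed = tf.keras.layers.Embedding(1000, 32)
        self.movie_embed = tf.keras.layers.Embedding(2000, 32)
        self.ratings = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(1),  # predicted rating
        ])

    def call(self, features):
        u = self.user_embed(features["user_id"])
        m = self.movie_embed(features["movie_id"])
        return self.ratings(tf.concat([u, m], axis=1))

model = MergedMovielensModel()
# One forward pass traces the shapes, then the model can be exported.
pred = model({"user_id": tf.constant([1]), "movie_id": tf.constant([7])})
tf.saved_model.save(model, tempfile.mkdtemp())
```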
OK thanks! That might be a nice tutorial or section to add in a future release: showing how to merge the models and serve them in TF Serving.
Is it possible to save the entire model (not model.user_model) in https://github.com/tensorflow/recommenders/blob/main/docs/examples/basic_retrieval.ipynb? I tried model.save() and got the error below.
```
cannot be saved because the input shapes have not been set. Usually, input shapes are automatically determined from calling `.fit()` or `.predict()`. To manually set the shapes, call `model.build(input_shape)`.
```

Thank you!
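For what it's worth, that error usually goes away once the subclassed model has been called at least once, so its input shapes are traced before export. A minimal illustration (the model is a stand-in, not the tutorial's MovielensModel):

```python
import tempfile
import tensorflow as tf

class TinyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(2)

    def call(self, x):
        return self.dense(x)

m = TinyModel()
assert not m.built   # no input shapes set yet; saving would fail here
m(tf.zeros([1, 3]))  # one forward pass sets the input shapes
assert m.built       # now the model can be exported
tf.saved_model.save(m, tempfile.mkdtemp())
```

Calling `model.build(input_shape)` achieves the same thing when a dummy forward pass is inconvenient.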