Closed — ahakanbaba closed this 1 year ago
Closing this per an offline discussion: the feature is certainly possible and welcome. A design sketch and a maintenance/support model, before getting into implementation, would be great.
I have implemented XGBoost Serving, a fork of TensorFlow Serving. It supports serving XGBoost models as well as combined XGBoost and FM models. You can read the README for more details and try it out in a few minutes. If you run into any problems, please file an issue or email me directly.
@ahakanbaba,
You can refer to the Creating a new kind of servable doc to extend TensorFlow Serving with a new kind of servable. Thank you.
This issue has been marked stale because it has had no activity for 7 days. It will be closed if no further activity occurs. Thank you.
This issue was closed due to lack of activity after being marked stale for the past 7 days.
Would the maintainers be amenable to adding other (non-TensorFlow) model platforms under /servables in the open-source version?
Describe the problem the feature is intended to solve
The tensorflow_serving libraries are generic enough to support any model platform, but the servables directory contains a working implementation only for TensorFlow. This adds friction to adopting tensorflow/serving for enterprise-level ML platforms that need to support several model platforms (XGBoost, LightGBM, PyTorch, scikit-learn, ...). With this feature request, onboarding other model platforms for serving could be significantly easier.
Describe the solution
Add new servables under tensorflow_serving and support other model platforms in the tensorflow_serving binary.
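For context, the model server's config already carries a `model_platform` field per model, so a non-TensorFlow servable could plausibly be selected through the existing models.config mechanism. A hypothetical fragment (the `"xgboost"` platform name is an assumption; today only `"tensorflow"` is handled by the open-source binary):

```
model_config_list {
  config {
    name: "my_xgboost_model"
    base_path: "/models/my_xgboost_model"
    model_platform: "xgboost"   # hypothetical; would require a registered platform
  }
}
```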
Describe alternatives you've considered
One could fork tensorflow/serving and use its libraries to add support for new model platforms independently. Without coordination with the upstream project, however, this could result in duplicated work or missed value for the broader community.
I have also seen some effort to convert tree models to TF graphs and use tensorflow-serving as is: https://github.com/yupbank/tree_to_tensorflow
At the Kubeflow level, one could use other model-serving technologies such as Seldon or BentoML (https://www.kubeflow.org/docs/components/serving/). In this issue I am specifically asking about adding support directly to tensorflow/serving.
Additional context
I see several others have asked for this. In this blog post (cited by the TensorFlow Serving paper) the author says:
Several other issues ask for similar features: https://github.com/tensorflow/serving/issues/768 and https://github.com/tensorflow/serving/issues/637
Relevant question on Stack Overflow: https://stackoverflow.com/q/49571655/5771861