tobegit3hub / simple_tensorflow_serving

Generic and easy-to-use serving service for machine learning models
https://stfs.readthedocs.io
Apache License 2.0
757 stars 195 forks

Will gRPC be supported in future? #77

Open hxyshare opened 4 years ago

hxyshare commented 4 years ago

Thank you for your work and for sharing it. I want to ask a question: when will this project support gRPC? @tobegit3hub

tobegit3hub commented 4 years ago

Actually we have a Python gRPC server in this project: https://github.com/tobegit3hub/tensorflow_template_application .

But I do not think a Python gRPC server would be better than the C++ one that TensorFlow Serving uses, so we may not add a Python gRPC server to this project.

But you can implement a simple one yourself, in the same way this project does.
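For reference, here is a minimal sketch of such a server. It uses grpcio's generic handlers with JSON payloads so no .proto compilation is needed; the `SimplePredictService`/`Predict` names, the model path, and the tensor names are placeholders for illustration, not part of this project.

```python
import json
from concurrent import futures

import grpc
import tensorflow as tf

# Load the SavedModel once at startup (TF 1.x style session-based loading).
# "./model/1" is a placeholder export directory.
session = tf.Session(graph=tf.Graph())
with session.graph.as_default():
    tf.saved_model.loader.load(session, ["serve"], "./model/1")


def predict(request_bytes, context):
    # The request is a JSON object mapping input tensor names to nested lists.
    inputs = json.loads(request_bytes.decode("utf-8"))
    feed_dict = {name + ":0": value for name, value in inputs.items()}
    # "outputs:0" is a placeholder output tensor name; adjust to your graph.
    outputs = session.run("outputs:0", feed_dict=feed_dict)
    return json.dumps({"outputs": outputs.tolist()}).encode("utf-8")


# Register a generic handler so the server works without generated stubs.
handler = grpc.method_handlers_generic_handler(
    "SimplePredictService",
    {
        "Predict": grpc.unary_unary_rpc_method_handler(
            predict,
            request_deserializer=lambda b: b,
            response_serializer=lambda b: b,
        )
    },
)

server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
server.add_generic_rpc_handlers((handler,))
server.add_insecure_port("[::]:50051")
server.start()
server.wait_for_termination()
```

A client can then open `grpc.insecure_channel("localhost:50051")`, call `channel.unary_unary("/SimplePredictService/Predict")(json.dumps(inputs).encode("utf-8"))`, and decode the JSON response.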

hxyshare commented 4 years ago

Thanks for your reply. I have implemented a gRPC version using simple_tensorflow_serving, mainly because I want to use custom ops, and custom op support in TensorFlow Serving is not very well implemented.
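For context, making a custom op available to a Python-based server is a one-line registration before the graph is loaded. A minimal sketch, TF 1.x style; the shared-library and model paths are placeholders, and the library must be built against the same TensorFlow version as the server.

```python
import tensorflow as tf

# Register the custom op kernels with the TensorFlow runtime. The .so path
# is a placeholder; point it at your own custom op build.
tf.load_op_library("./libtf_custom_ops.so")

# Once registered, a SavedModel whose graph uses the op can be loaded and
# run as usual ("./model/1" is a placeholder export directory).
session = tf.Session(graph=tf.Graph())
with session.graph.as_default():
    tf.saved_model.loader.load(session, ["serve"], "./model/1")
```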

hxyshare commented 4 years ago

After adding the custom op, inference is 1.5 times faster than with TensorFlow Serving. The op is FasterTransformer, which is implemented by NVIDIA. I tried many ways to use it as a custom op in TensorFlow Serving but failed. If I could add this op to TensorFlow Serving, inference should be even faster. Have a good weekend.