tensorflow / serving

A flexible, high-performance serving system for machine learning models
https://www.tensorflow.org/serving
Apache License 2.0

Support for TensorFlow pluggable devices #2038

Open pksubbarao opened 2 years ago

pksubbarao commented 2 years ago

Describe the problem the feature is intended to solve

TensorFlow's pluggable device architecture offers a plugin mechanism for registering devices with TensorFlow without needing to make changes in TensorFlow code. It provides a set of C APIs as an ABI-stable way to register a custom device runtime, kernels/ops, graph optimizer, and profiler.
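For reference, the device-runtime part of that plugin mechanism is exposed through an entry point in TensorFlow's StreamExecutor C API. Below is a minimal sketch of that entry point; it assumes the TensorFlow plugin headers are available at build time, the device name/type strings are placeholders, and the body is illustrative rather than a complete plugin.

```c
// Sketch of a PluggableDevice registration entry point.
// Requires the TensorFlow plugin headers; not compilable standalone.
#include "tensorflow/c/experimental/stream_executor/stream_executor.h"

// TensorFlow looks up this symbol when it loads the plugin's shared
// library and calls it to register the custom device runtime.
void SE_InitPlugin(SE_PlatformRegistrationParams* params, TF_Status* status) {
  params->platform->struct_size = SP_PLATFORM_STRUCT_SIZE;
  params->platform->name = "MY_DEVICE";  // placeholder device name
  params->platform->type = "XPU";        // placeholder device type
  // ... fill in params->platform_fns with the create_device,
  // create_stream, memory allocation callbacks, etc.
}
```

Per the pluggable-device RFC, analogous entry points cover the other components: TF_InitKernel for kernels/ops, TF_InitGraph for the graph optimizer, and TF_InitProfiler for the profiler. The question here is whether a shared library exporting these symbols can be loaded by the ModelServer binary at all.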

With this, developing support for third-party custom devices in TensorFlow is greatly simplified. However, it's not clear whether these plugins can work with TF Serving. I can find documentation for serving TensorFlow models with custom ops by copying the source into the Serving project and building a static library for the op, but I couldn't find anything on custom or pluggable devices for TF Serving.
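For comparison, the custom-op path mentioned above is wired in at build time: the op's source is copied into the Serving tree, built as a static library, and linked into the server binary. A rough sketch, with a placeholder op name and path:

```starlark
# tensorflow_serving/custom_ops/my_op/BUILD  (placeholder op and path)
cc_library(
    name = "my_op",
    srcs = ["my_op.cc"],
    deps = ["@org_tensorflow//tensorflow/core:framework"],
    alwayslink = 1,  # keep the op's static registration from being stripped
)
```

The target is then added to the SUPPORTED_TENSORFLOW_OPS list in tensorflow_serving/model_servers/BUILD so it is linked into ModelServer. This works for ops, but gives no hook into device registration, graph optimization, or memory management, which is why a pluggable-device path is being asked about.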

I would appreciate any documentation or instructions for Serving with custom/pluggable 3rd party devices. If this is not currently supported, any information on plans for future support would be helpful.

Thanks

Describe the solution

Pluggable devices should be compatible with TF Serving.

Describe alternatives you've considered

Considered custom ops, which can be used to define ops/kernels, but that approach lacks graph optimization and memory management support.


pksubbarao commented 2 years ago

Hello... Any update on this issue? Thanks