Closed ndeep27 closed 2 months ago
@ndeep27 TensorFlow Serving's built-in batching functionality currently doesn't directly support sparse tensors. As you pointed out, it relies on tf.concat, which only handles dense tensors. Please have a look at the similar issue discussed here. For any further queries, kindly post the issue in the TF Serving repository. Thank you!
This issue is stale because it has been open for 7 days with no activity. It will be closed if no further activity occurs. Thank you.
This issue was closed because it has been inactive for 7 days since being marked as stale. Please reopen if you'd like to work on this further.
Issue type
Support
Have you reproduced the bug with TensorFlow Nightly?
Yes
Source
source
TensorFlow version
TF 2.11
Custom code
No
OS platform and distribution
Linux RHEL
Mobile device
No response
Python version
No response
Bazel version
No response
GCC/compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current behavior?
Is there a utility method to batch sparse tensors together? TensorFlow Serving's batching support uses the dense tensor concat op: https://github.com/tensorflow/serving/blob/67a2dcb2b9c057847616fb5ce54cb8545955d144/tensorflow_serving/batching/batching_session.cc#L577
Is there an equivalent for SparseTensor batching?
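On the Python side, `tf.sparse.concat` (combined with `tf.sparse.expand_dims` to add a leading batch dimension) is the closest in-graph equivalent, though TF Serving's batching session does not use it. As a minimal pure-Python sketch of the index arithmetic such a batcher would perform on COO-format sparse tensors (the function name and COO-triple layout are illustrative, not TF APIs):

```python
def batch_sparse(coo_tensors):
    """Batch a list of (indices, values, dense_shape) COO triples.

    Each rank-R input becomes one slice of a rank R+1 batched tensor:
    a batch index is prepended to every coordinate, and the non-batch
    dimensions are padded to the elementwise maximum shape.
    """
    batched_indices, batched_values = [], []
    max_shape = [0] * len(coo_tensors[0][2])
    for batch_idx, (indices, values, dense_shape) in enumerate(coo_tensors):
        for coord, value in zip(indices, values):
            # Prepend the batch index to each nonzero's coordinates.
            batched_indices.append([batch_idx] + list(coord))
            batched_values.append(value)
        max_shape = [max(m, s) for m, s in zip(max_shape, dense_shape)]
    batched_shape = [len(coo_tensors)] + max_shape
    return batched_indices, batched_values, batched_shape


# Example: two rank-2 sparse tensors of dense shape [1, 3].
a = ([[0, 1]], [1.0], [1, 3])
b = ([[0, 0], [0, 2]], [2.0, 3.0], [1, 3])
idx, vals, shape = batch_sparse([a, b])
# idx == [[0, 0, 1], [1, 0, 0], [1, 0, 2]], shape == [2, 1, 3]
```

This mirrors what unbatching must invert on the response path, which is part of why sparse batching is harder for Serving to support generically than a plain dense concat.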
Standalone code to reproduce the issue
Relevant log output