-
### System Info
**Description**
I am experiencing an issue when using the transformers library version 4.36.1 with a custom model serving endpoint that utilizes mlflow. The model is based on the Res…
-
I'm a SWE on LinkedIn's ML infra team. We are investigating whether we can adopt Triton Server for our GPU workloads.
We have one question regarding the dynamic batching capability of Triton…
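For reference, Triton's dynamic batcher is enabled per model in its `config.pbtxt`. A minimal sketch, assuming a placeholder model name and arbitrary batch-size choices (tune these for your workload):

```
name: "my_model"            # placeholder model name
max_batch_size: 32
dynamic_batching {
  preferred_batch_size: [ 8, 16 ]
  max_queue_delay_microseconds: 100
}
```

With this, Triton coalesces individual requests into server-side batches, waiting up to `max_queue_delay_microseconds` to form a preferred batch size.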
-
hi
-
When I use TF-Serving with batching options and variable length inputs, I could get errors like `Tensors with name 'example_feature:0' from different tasks have different shapes and padding is turned …
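One common client-side workaround for that error is to pad variable-length inputs to a uniform length before sending the batch, so every task's tensor has the same shape. A minimal sketch (the pad value is an assumption; use whatever your model treats as padding):

```python
def pad_batch(sequences, pad_value=0):
    """Right-pad variable-length sequences to the batch's max length,
    producing a rectangular batch that TF-Serving can stack."""
    max_len = max(len(s) for s in sequences)
    return [s + [pad_value] * (max_len - len(s)) for s in sequences]

# Example: three requests of different lengths become one rectangular batch.
batch = pad_batch([[1, 2, 3], [4], [5, 6]])
# batch == [[1, 2, 3], [4, 0, 0], [5, 6, 0]]
```

Alternatively, depending on your TF-Serving version, the server can pad for you when `pad_variable_length_inputs` is enabled in the session bundle / batching config.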
-
I am working on making this compatible with tf-serving. How do I set the config variables? tf-serving only takes tensors, right? So how do you update scalars?
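For context, in TF-Serving's REST API everything in the request body is interpreted as a tensor, and a plain scalar value is simply a rank-0 tensor. A minimal sketch of building such a request (the feature names here are placeholders, not from the original post):

```python
import json

def build_predict_request(text, threshold):
    """Build a TF-Serving REST predict body.
    Scalar JSON values like `threshold` become rank-0 tensors server-side."""
    return json.dumps({
        "instances": [
            {"input_text": text, "threshold": threshold}  # placeholder feature names
        ]
    })

payload = build_predict_request("hello world", 0.5)
```

The body would then be POSTed to `/v1/models/<model_name>:predict`.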
-
hello,
Where are the tensors used for serving declared during model creation?
If this is not the entire code for seq2seq, could you please post the entire code?
Regards
Nidhi
-
I have modified the TPUEstimator to use the [BestExporter](https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/estimator/BestExporter), with a feature spec and examples:
```python
def s…
-
Models trained by this pipeline perform great, but how do I host them with TensorFlow Model Serving? The checkpoint needs to be converted into the SavedModel (.pb) format.
What I've done so far is:
1. I'…
-
If we export the model as part of TensorFlow Serving, how should the input be fed to get the output?
Currently the input is fed in .bin format. But for serving, the input should be in th…
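When the raw input is a binary file (such as a `.bin`), the TF-Serving REST API expects it base64-encoded under a `"b64"` key in the instance. A minimal sketch, assuming a placeholder input name of `input_bytes`:

```python
import base64
import json

def encode_binary_instance(raw_bytes):
    """Wrap raw bytes as a TF-Serving REST instance.
    Binary content must be base64-encoded under the "b64" key."""
    return {"input_bytes": {"b64": base64.b64encode(raw_bytes).decode("utf-8")}}

# Example request body for POST /v1/models/<model_name>:predict
body = json.dumps({"instances": [encode_binary_instance(b"\x00\x01\x02")]})
```

For the gRPC API, the bytes would instead go directly into the `string_val` field of a `TensorProto` with `DT_STRING` dtype.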
-
Deploying the image text-orientation model as a serving service.
Model reference documentation: https://github.com/PaddlePaddle/PaddleClas/blob/release/2.5/docs/zh_CN/models/PULC/PULC_text_image_orientation.md
I fine-tuned the orientation classification model, and the inference model predicts correctly.
Deployment reference documentation: https://github.c…