-
### 📚 The doc issue
I followed the steps in [Getting Started](https://github.com/pytorch/serve/blob/master/docs/getting_started.md). When I run the `torchserve --start --ncs --model-store model_store --…
-
Here is my command: `python mmocr/utils/ocr.py demo/test.jpg --det PS_IC15 --recog SAR_CN --imshow --output demo/result.jpg`
FileNotFoundError: SARNet: AttnConvertor: [Errno 2] No such file or direc…
-
I'm using TorchServe for model deployment.
Can I deploy KeyBERT with TorchServe?
Could you give me some advice?
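For context, here is a rough sketch of the kind of custom handler I have in mind, assuming KeyBERT is installed in the worker environment; the handler name and request format are just my guesses, not an official recipe:
```py
# keybert_handler.py -- hypothetical custom handler, not part of TorchServe or KeyBERT
from keybert import KeyBERT
from ts.torch_handler.base_handler import BaseHandler


class KeyBERTHandler(BaseHandler):
    def initialize(self, context):
        # Load the keyword-extraction model once per worker process.
        self.model = KeyBERT()
        self.initialized = True

    def preprocess(self, data):
        # Each request item carries the document text under "data" or "body".
        docs = []
        for row in data:
            text = row.get("data") or row.get("body")
            if isinstance(text, (bytes, bytearray)):
                text = text.decode("utf-8")
            docs.append(text)
        return docs

    def inference(self, docs, *args, **kwargs):
        return [self.model.extract_keywords(doc) for doc in docs]

    def postprocess(self, outputs):
        return outputs
```
Packaging it with `torch-model-archiver --handler keybert_handler.py` would presumably follow the usual custom-handler workflow.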
-
The variable `light_the_torch._patch.PYTORCH_DISTRIBUTIONS` is no longer aligned with
the wheels hosted by PyTorch. Please replace it with
```py
PYTORCH_DISTRIBUTIONS = {
"pytorch_triton",
"tā¦
-
### 🐛 Describe the bug
## What I'm trying to do
In TorchServe we can spawn multiple Python processes for the same PyTorch model to scale inference. We've recently added support for `torch.compil…
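As a rough sketch of the pattern, assuming a recent PyTorch where `torch.compile` is available (handler and model names are placeholders, not the actual implementation):
```py
import torch
from ts.torch_handler.base_handler import BaseHandler


class CompiledModelHandler(BaseHandler):
    def initialize(self, context):
        super().initialize(context)  # loads self.model from the .mar contents
        # Compile once per worker; each spawned Python process pays its own
        # compilation cost the first time it serves a request.
        self.model = torch.compile(self.model)

    def inference(self, data, *args, **kwargs):
        with torch.inference_mode():
            return self.model(data)
```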
-
### 🐛 Describe the bug
TL;DR a ScriptModule model runs fine for inference, but passing it to `torch.onnx.export` fails with an internal error. @justinchuby @titaiwangms @BowenBao @thiagocrepaldi i…
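A minimal sketch of the pattern (with a toy placeholder module, not the actual failing model):
```py
import torch


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)


scripted = torch.jit.script(TinyModel())  # ScriptModule: inference works fine
example = torch.randn(1, 4)
print(scripted(example))                  # ok

# This is the call the issue reports failing with an internal error
# for the real model.
torch.onnx.export(scripted, (example,), "tiny_model.onnx")
```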
-
### 🚀 The feature
fsspec gives the ability to work with remote file systems like S3, GCS, or Azure Blob storage with a single unified API.
See this recent example for how broadly useful it is: http…
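A quick illustration of the unified API (bucket names are made up; the S3 example assumes `s3fs` is installed):
```py
import fsspec

# Local file and remote object go through the same interface.
with fsspec.open("data/labels.csv", "rt") as f:
    header = f.readline()

with fsspec.open("s3://some-bucket/labels.csv", "rt") as f:  # needs s3fs
    header = f.readline()

# Filesystem-style operations also work uniformly.
fs = fsspec.filesystem("s3")
print(fs.ls("some-bucket"))
```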
-
### 📚 The doc issue
On the page https://pytorch.org/serve/metrics_api.html, there are references to the following:
1. ts_inference_latency_microseconds
2. ts_queue_latency_microseconds
…
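For reference, a sketch of how I am inspecting these metrics, assuming the default metrics port 8082 and Prometheus-format output from a running TorchServe instance:
```py
import urllib.request

# Default TorchServe metrics endpoint; the port is configurable,
# 8082 is only the assumed default here.
with urllib.request.urlopen("http://localhost:8082/metrics") as resp:
    body = resp.read().decode("utf-8")

# Print the latency metrics mentioned above, if they are reported.
for line in body.splitlines():
    if "latency_microseconds" in line:
        print(line)
```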
-
### 🚀 The feature
The /ping endpoint should return a 5xx error when the backend is unhealthy.
### Motivation, pitch
Currently the /ping health check always returns HTTP OK, whether the backe…
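For illustration, a sketch of how the current behaviour can be checked, assuming the default inference port 8080:
```py
import urllib.request

# /ping lives on the inference API (default port 8080). The request here is
# that this return a 5xx status when workers are unhealthy instead of 200.
with urllib.request.urlopen("http://localhost:8080/ping") as resp:
    print(resp.status, resp.read().decode("utf-8"))
```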
-
Hi, I am wondering whether I can use TorchServe to deploy a PyTorch FaceNet model. How do I set up something that takes an image, passes it through the model to get the encodings, and then compares …
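Here is a rough sketch of the kind of custom handler I imagine, with a placeholder FaceNet model and a simple cosine comparison; all names and the preprocessing are just my assumptions, not an existing TorchServe recipe:
```py
import io

import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms
from ts.torch_handler.base_handler import BaseHandler


class FaceNetHandler(BaseHandler):
    """Hypothetical handler: image in, embedding (or match score) out."""

    def initialize(self, context):
        super().initialize(context)  # loads the serialized model from the .mar
        self.transform = transforms.Compose([
            transforms.Resize((160, 160)),
            transforms.ToTensor(),
        ])
        # Reference embedding to compare against; in practice this could be
        # loaded from a file packaged with --extra-files.
        self.reference = None

    def preprocess(self, data):
        images = []
        for row in data:
            image_bytes = row.get("data") or row.get("body")
            image = Image.open(io.BytesIO(image_bytes)).convert("RGB")
            images.append(self.transform(image))
        return torch.stack(images)

    def inference(self, batch, *args, **kwargs):
        with torch.inference_mode():
            return self.model(batch)  # embeddings

    def postprocess(self, embeddings):
        if self.reference is None:
            return embeddings.tolist()
        scores = F.cosine_similarity(embeddings, self.reference.unsqueeze(0))
        return scores.tolist()
```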