-
Hello, could you explain the thought process for deploying a CNN+Transformer model? Is there any reference code I could learn from?
-
## 🚀 Feature
**Document: https://torchserve-docs.s3-us-west-2.amazonaws.com/docs/torchserve_architecture_v0.pdf**
**NOTE: The above document is a draft and subject to change**
Request to buil…
-
Currently, ONNX inference latency for the TCN model is about 1.5 ms; a month ago it was about 1.04 ms, and I don't know why it regressed.
server: cpx-3
onnxruntime version: 1.10.0
bigdl-nano==0.14.0b20220118
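When comparing latency across library versions, it helps to measure the same way both times: fixed warm-up, many iterations, and a median rather than a mean. A minimal timing harness like the following could be used (the `bench` helper and the session/feed names in the comment are assumptions, not BigDL or ONNX Runtime API):

```python
import statistics
import time

def bench(run, warmup=50, iters=500):
    """Time a zero-arg callable; return the median latency in ms."""
    for _ in range(warmup):      # discard warm-up calls
        run()
    samples = []
    for _ in range(iters):
        t0 = time.perf_counter()
        run()
        samples.append((time.perf_counter() - t0) * 1e3)
    return statistics.median(samples)

# With an ONNX Runtime session it would be used roughly like this
# (session creation and input names are placeholders):
#   sess = onnxruntime.InferenceSession("tcn.onnx")
#   ms = bench(lambda: sess.run(None, {"input": x}))
ms = bench(lambda: sum(range(1000)))  # stand-in workload
print(f"{ms:.3f} ms")
```

Thread settings (`intra_op_num_threads`, CPU affinity) and the CPU governor on the server can easily account for a 1.0 ms vs 1.5 ms difference, so pinning those is worth checking before blaming the version bump.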
-
The initial loading of TorchScript models requires a long warm-up time; the first and second calls are especially slow.
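This is expected TorchScript behavior: the profiling executor specializes and optimizes the graph during the first couple of invocations. A small self-contained sketch that makes the effect visible (the traced `Linear` here stands in for a real `torch.jit.load("model.pt")`, which is assumed, not shown):

```python
import time
import torch

# Stand-in for a loaded TorchScript model; in practice this would be
# model = torch.jit.load("model.pt")
model = torch.jit.trace(torch.nn.Linear(8, 8), torch.randn(1, 8))
x = torch.randn(1, 8)

def call_ms():
    t0 = time.perf_counter()
    with torch.no_grad():
        model(x)
    return (time.perf_counter() - t0) * 1e3

first = call_ms()        # includes JIT profiling/optimization work
for _ in range(10):      # warm-up: the first 1-2 calls recompile
    call_ms()
steady = call_ms()
print(f"first={first:.2f} ms, steady={steady:.2f} ms")
```

A common workaround is to run a few dummy forward passes with representative input shapes right after loading, before the model serves real traffic.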
### To Reproduce
```python
import torch
from bigdl.chronos.forecaster import T…
-
The Model Analyzer is an amazing tool; it is very useful for model performance analysis.
Since I am deploying Triton Inference Server on a Jetson Nano and on Windows 10, I tried to install it on my …
-
**Description**
(same issue https://github.com/triton-inference-server/server/issues/3206)
I have a Triton model that accepts a binary string. I want to send a wav file; if I do it through the cli…
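For a Triton `TYPE_STRING` (BYTES) input, the client-side representation is a NumPy array of `dtype=object` whose elements are raw Python `bytes`, so the whole wav file can be read and wrapped in a shape-`[1]` tensor. A sketch of the packing step (the file contents and the `INPUT0` name in the comment are placeholders):

```python
import numpy as np

# Stand-in for: wav_bytes = open("audio.wav", "rb").read()
wav_bytes = b"RIFF\x00\x00\x00\x00WAVEfmt "

# A BYTES tensor is an object-dtype array holding raw bytes elements.
arr = np.array([wav_bytes], dtype=object)

# With the tritonclient HTTP API this would then be sent roughly as
# (names are assumptions, shown for orientation only):
#   inp = tritonclient.http.InferInput("INPUT0", arr.shape, "BYTES")
#   inp.set_data_from_numpy(arr, binary_data=True)
print(arr.shape, type(arr[0]))
```

Passing `binary_data=True` avoids base64-encoding the payload in the JSON body, which matters for large audio files.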
-
### System Info
Linux devserver-ei 5.4.0-144-generic #161-Ubuntu SMP Fri Feb 3 14:49:04 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
![image](https://github.com/triton-inference-server/tensorrtllm_back…
-
Hi team!
I want to test my custom Python module (model.py) locally before sending it to the Triton server.
So I have a Python interpreter and a stand-alone file test.py where I want to do something like this:
```…
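One way to drive a Python-backend `model.py` outside the Triton runtime is to stub out `triton_python_backend_utils` before importing it, since that module only exists inside the server. A sketch of the stubbing step (the stub attribute names mirror the real `pb_utils` module but are otherwise assumptions, and the driving code in the comment is hypothetical):

```python
import sys
import types

# Fake module so `import triton_python_backend_utils` inside model.py
# succeeds in a plain interpreter (attributes are minimal stand-ins).
stub = types.ModuleType("triton_python_backend_utils")
stub.Tensor = object
stub.InferenceResponse = object
sys.modules["triton_python_backend_utils"] = stub

# model.py can now be imported and its class driven directly, e.g.:
#   from model import TritonPythonModel
#   m = TritonPythonModel()
#   m.initialize({"model_config": "{}"})
#   responses = m.execute([fake_request])
print("triton_python_backend_utils" in sys.modules)
```

Any `pb_utils` function the model actually calls (e.g. tensor construction) would need a matching stub, so this works best for models with thin `execute` logic.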
-
Hello,
I have finished evaluating the Episodic Transformer baseline for the TEACh Benchmark Challenge on valid_seen.
However, one odd thing I found is that our reproduced result is…
-
### Model description
Hello! Thanks for this great work :)
Previously, I implemented [mpnet-rs](https://github.com/NewBornRustacean/mpnet-rs) and found a related issue (feature request): #33
If …