-
## background
My question is about executing encoder-decoder models with the ONNX Runtime GenAI runtime. My goal is to convert the DONUT transformer (https://arxiv.org/abs/2111.15664), a sequence-to-sequence tra…
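For context, a minimal sketch of the export step such a conversion usually starts from, using Optimum's `ORTModelForVision2Seq` (this targets plain ONNX Runtime rather than the GenAI runtime; the checkpoint name and output directory are placeholders):

```python
from optimum.onnxruntime import ORTModelForVision2Seq
from transformers import DonutProcessor

# Placeholder checkpoint; any DONUT-style VisionEncoderDecoder checkpoint should work.
model_id = "naver-clova-ix/donut-base"

# export=True converts the encoder and decoder into separate ONNX graphs on the fly.
model = ORTModelForVision2Seq.from_pretrained(model_id, export=True)
processor = DonutProcessor.from_pretrained(model_id)

# Persist the exported ONNX files so they can be inspected or repackaged later.
model.save_pretrained("donut-onnx")
processor.save_pretrained("donut-onnx")
```

The output directory typically contains separate encoder and decoder ONNX graphs, which is the usual starting point for wiring a sequence-to-sequence model into another runtime.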
-
**Image I'm using:**
- OS Image: Bottlerocket OS 1.27.1 (aws-k8s-1.30-nvidia)
- Kernel version: 6.1.115
- Container runtime: containerd://1.7.22+bottlerocket
- Kubelet version: v1.30.4-eks-16b398d
…
-
### Describe the issue
Running a number of different models from the Hugging Face Hub with the `CoreMLExecutionProvider` on macOS arm64 prints multiple instances of:
```text
Context leak detected, msg…
```
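For reproduction, a minimal sketch of the kind of session that triggers these messages (the model path is a placeholder; the CoreML EP is requested with default options):

```python
import onnxruntime as ort

# Placeholder model path; the report mentions several Hugging Face Hub models behave this way.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CoreMLExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())
```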
-
### Describe the issue
When creating an ort::Session for most of our models with the TensorRT execution provider, we get this message at the Warning level. When we use the CPU, DML, or CUDA providers we don't get any error and the …
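The original report uses the C++ API (`ort::Session`); a rough Python equivalent that reproduces the provider selection might look like this (the model path is a placeholder, with warning-level logging enabled so the message is visible):

```python
import onnxruntime as ort

so = ort.SessionOptions()
so.log_severity_level = 2  # 2 = Warning; the reported message appears at this level

# Placeholder model path; TensorRT falls back to CUDA, then CPU, for unsupported nodes.
session = ort.InferenceSession(
    "model.onnx",
    sess_options=so,
    providers=[
        "TensorrtExecutionProvider",
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)
```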
-
I am trying to build this extension library using Bazel. However, for whatever reason, I seem to be running into issues related to obtaining the actual ONNX Runtime library:
```
-- The C compiler identificati…
```
-
### Is your feature request related to a problem?
Integrating ONNX Runtime into SurrealDB enables seamless execution of SurrealML models directly within the database environment, improving performanc…
-
### Description
I'm trying to build OpenVINO as a static library, along with all of its dependencies, due to the nature of my project. Unfortunately, TBB cannot be built statically, so I hope to use …
-
### Describe the issue
ONNX Runtime will throw an error if the given ONNX model has a newer IR version that is not yet supported by the installed onnxruntime package:
https://github.com/microsoft/onnxruntime…
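A small sketch (the model path is a placeholder) for inspecting the IR version and opset imports with the `onnx` package before handing the file to onnxruntime, which makes the mismatch easier to diagnose:

```python
import onnx

# Placeholder path; load the model proto to read its version metadata.
model = onnx.load("model.onnx")
print("IR version:", model.ir_version)
print("Opset imports:", [(imp.domain or "ai.onnx", imp.version) for imp in model.opset_import])
```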
-
### System Info
Hi,
I did a test comparing the Optimum ONNX export + ORTOptimizer inference against setfit.export_onnx + onnxruntime.InferenceSession.
It seems that the Optimum ORTOptimizer inference …
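For reference, a sketch of the Optimum side of that comparison (the checkpoint name is a placeholder and `export=True` assumes a recent Optimum release):

```python
from optimum.onnxruntime import ORTModelForFeatureExtraction, ORTOptimizer
from optimum.onnxruntime.configuration import OptimizationConfig

# Placeholder checkpoint; SetFit bodies are typically sentence-transformers encoders.
model = ORTModelForFeatureExtraction.from_pretrained(
    "sentence-transformers/all-MiniLM-L6-v2", export=True
)

# Apply graph optimizations and save the optimized model alongside the original export.
optimizer = ORTOptimizer.from_pretrained(model)
optimizer.optimize(
    save_dir="onnx-optimized",
    optimization_config=OptimizationConfig(optimization_level=2),
)
```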
-
Conversion script:
```python
#!/usr/bin/env python
# coding: utf-8
from rknn.api import RKNN
from sys import exit
rknn = RKNN(verbose=True)
ONNX_MODEL="RWKV-x060-World-1B6-v2.1-20240328-ctx4096.onnx"
R…