-
### Describe the issue
Optimizing the transformer (BERT) model raises this error: `assert node is not None`
### To reproduce
I am using this script and dataset: [sequence labeling](https://github.c…
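For context, a minimal sketch of how the ONNX Runtime transformer optimizer is typically invoked on a BERT export; the file name, head count, and hidden size below are placeholders rather than values taken from the report:

```python
# Hedged sketch: optimize a BERT ONNX model with onnxruntime's transformer optimizer.
# "bert.onnx", num_heads, and hidden_size are assumed placeholder values.
from onnxruntime.transformers import optimizer

optimized = optimizer.optimize_model(
    "bert.onnx",          # path to the exported BERT model (assumed)
    model_type="bert",
    num_heads=12,         # must match the exported model's configuration
    hidden_size=768,
)
optimized.save_model_to_file("bert_opt.onnx")
```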
-
The current runtime is based on libtorch:
https://github.com/Snowdar/asv-subtools/tree/master/runtime
Is there a plan to provide a script for ONNX export?
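Until an official script is available, here is a hedged sketch of what an ONNX export of such a PyTorch model usually looks like; the stand-in network, feature layout, and tensor names below are assumptions, not taken from asv-subtools:

```python
# Hedged sketch: export a PyTorch speaker-embedding model to ONNX.
# DummyEmbedder stands in for the trained network; shapes and names are assumptions.
import torch
import torch.nn as nn

class DummyEmbedder(nn.Module):
    def __init__(self, feat_dim=80, emb_dim=256):
        super().__init__()
        self.conv = nn.Conv1d(feat_dim, emb_dim, kernel_size=5, padding=2)

    def forward(self, feats):                 # feats: (batch, feat_dim, frames)
        return self.conv(feats).mean(dim=2)   # (batch, emb_dim) embedding

model = DummyEmbedder().eval()
dummy_feats = torch.randn(1, 80, 200)

torch.onnx.export(
    model,
    dummy_feats,
    "speaker_model.onnx",
    input_names=["feats"],
    output_names=["embedding"],
    dynamic_axes={"feats": {0: "batch", 2: "frames"}},  # variable batch / utterance length
    opset_version=13,
)
```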
-
### Describe the issue
When I load a model using ONNX Runtime DirectML (C++), the code gets stuck or crashes at the step of loading the ONNX model on one of my Windows 10 laptops (other machines work fine), any clu…
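One hedged way to check whether the hang is machine-specific rather than tied to the C++ code is to load the same model through the onnxruntime-directml Python package on the affected laptop; the model path below is a placeholder:

```python
# Hedged sketch: load the same model via onnxruntime-directml from Python
# to see whether session creation also hangs on the affected machine.
import onnxruntime as ort

sess = ort.InferenceSession(
    "model.onnx",                                   # placeholder path
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print(sess.get_providers())  # confirms which providers were actually applied
```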
-
**Describe the bug**
This error occurs while converting a model:
```
2024-08-24 23:42:22,711 - INFO - Using tensorflow=2.17.0, onnx=1.16.2, tf2onnx=1.16.1/f85e88
2024-08-24 23:42:22,711 - INFO - Usin…
```
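The log is cut off above. For reference, a hedged sketch of a tf2onnx Python-API conversion that produces this kind of log header; the `tf.function`, shapes, and opset below are assumptions, not values from the report:

```python
# Hedged sketch: convert a tf.function to ONNX with tf2onnx's Python API.
import tensorflow as tf
import tf2onnx

@tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32, name="x")])
def f(x):
    return tf.nn.relu(tf.matmul(x, tf.ones([4, 2])))

onnx_model, _ = tf2onnx.convert.from_function(
    f,
    input_signature=[tf.TensorSpec([None, 4], tf.float32, name="x")],
    opset=17,
    output_path="model.onnx",
)
```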
-
I am using a RoBERTa-based sentence-transformer model for a chatbot. For inference, I have exported the model to ONNX. When the max_length of the tokens
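The report is cut off mid-sentence above. As a hedged sketch of the inference path it describes, here is one way such an exported model is typically run with a fixed `max_length`; the tokenizer name, ONNX path, input/output names, and pooling choice are all assumptions:

```python
# Hedged sketch: run an exported sentence-transformer ONNX model with a fixed max_length.
# The tokenizer, paths, tensor names, and mean pooling are assumptions.
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-distilroberta-v1")
sess = ort.InferenceSession("sbert.onnx", providers=["CPUExecutionProvider"])

enc = tokenizer(
    ["how do I reset my password?"],
    padding="max_length", truncation=True, max_length=128,
    return_tensors="np",
)
outputs = sess.run(None, {"input_ids": enc["input_ids"],
                          "attention_mask": enc["attention_mask"]})

# Mean-pool token embeddings (assumed first output) into one sentence embedding.
mask = enc["attention_mask"][..., None].astype(np.float32)
sentence_emb = (outputs[0] * mask).sum(axis=1) / mask.sum(axis=1)
```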
-
Even though all requirements are installed, this line of code in utilities.py gives the error below:
`import tensorrt as trt`
```
*** Error loading script: trt.py
Traceback (most recent call last):
…
```
-
### 🐛 Describe the bug
When attempting to export the UDOP model to ONNX from the transformers library, the torch.onnx.export() command fails with a RuntimeError. Below is a minimal example to repro…
-
### Describe the issue
After converting the swin_transformer model from the PyTorch framework to an ONNX model and enabling dynamic input, the following error occurs:
onnxruntime.capi.onnxrunti…
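The runtime error is truncated above. For reference, a hedged sketch of the export settings usually involved, using torchvision's `swin_t` as a stand-in for the author's model; the axis choices are assumptions, and Swin's window partitioning often ties the graph to a fixed spatial size, which is a common source of dynamic-input failures:

```python
# Hedged sketch: export a Swin transformer with a dynamic batch axis.
# torchvision's swin_t is a stand-in; spatial dims are kept fixed because
# Swin's window partitioning usually bakes them into the exported graph.
import torch
from torchvision.models import swin_t

model = swin_t(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy,
    "swin.onnx",
    input_names=["pixel_values"],
    output_names=["logits"],
    dynamic_axes={"pixel_values": {0: "batch"}, "logits": {0: "batch"}},
    opset_version=17,
)
```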
-
### System information
- Windows 10
- .net 5.0
- ONNX Runtime installed from NuGet
- ONNX Runtime version: 1.5.2
- Microsoft.ML version: 1.5.2
- CPU only
### Issue
Even after disposing the …
-
### Describe the issue
For v1.14.0 with openvino_2022.2.0.7713:
```
openvino/ov_interface.cc:54 onnxruntime::openvino_ep::OVExeNetwork onnxruntime::openvino_ep::OVCore::LoadNetwork(std::shared_pt…
```
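For reference, a hedged sketch of how the OpenVINO execution provider is typically selected from Python; the failing `LoadNetwork` call above happens inside this session construction, and the model path and `device_type` value are placeholders:

```python
# Hedged sketch: create an ONNX Runtime session with the OpenVINO execution provider.
# The model path and device_type are placeholders, not taken from the report.
import onnxruntime as ort

sess = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"device_type": "CPU_FP32"}, {}],
)
print(sess.get_providers())
```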