-
**Describe the bug**
I am trying to build ONNXRuntime v1.8.0 on an RPi Zero. During the build I get a pthread linking error. Here is the full stack trace:
```
pi@raspberrypi:~/onnxruntime $ bash build…
```
-
### Describe the feature request
I'm running into some problems with the Windows-provided DirectML install (in C:/windows/system32/DirectML.dll) conflicting with the DirectML.dll version that onnxrun…
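A simplified way to see why the system copy can shadow a bundled one: the loader walks an ordered list of directories and uses the first DirectML.dll it finds. The sketch below is a hedged approximation of that behaviour, not the real Windows loader; the `first_on_path` helper and its directory-list argument are illustrative names introduced here:

```python
import os

def first_on_path(dll_name, search_dirs):
    """Return the first directory in search_dirs that contains dll_name.

    This is a simplistic model of DLL search order (the real Windows loader
    has additional rules); it illustrates how a copy earlier in the search
    order shadows one shipped alongside the application.
    """
    for d in search_dirs:
        if os.path.exists(os.path.join(d, dll_name)):
            return d
    return None
```

Under this model, if the system directory precedes the application directory in `search_dirs`, the system DirectML.dll wins even when a newer copy sits next to the executable.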
-
### Describe the issue
There is no `source distribution` on PyPI, and I am unable to build one; I tried building with `python -m build --sdist`.
`Necessity`: Trying to…
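One quick way to confirm that no sdist is published is to inspect the package's file list from the PyPI JSON API. The sketch below is illustrative: the helper names are hypothetical, and `release_files` is assumed to be shaped like the `urls` field of `https://pypi.org/pypi/<package>/json`:

```python
import json
import urllib.request

def has_sdist(release_files):
    """Return True if any published file is a source distribution.

    release_files: a list of dicts shaped like the "urls" field of the
    PyPI JSON API response (an assumption of this sketch).
    """
    return any(f.get("packagetype") == "sdist" for f in release_files)

def fetch_release_files(package):
    # Queries the live PyPI JSON API — requires network access.
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["urls"]
```

If `has_sdist(fetch_release_files("onnxruntime"))` returns `False` (only wheel entries present), that is consistent with the report above.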
-
### Describe the issue
![image](https://user-images.githubusercontent.com/39560035/189062446-7db83c6a-a1d7-44a3-93ba-d7cf5022b1e1.png)
Excuse me, I've been using the tutorial to install ONNX and have…
-
### Describe the issue
Unable to build `onnxruntime` in a Debian Bookworm container for `armv7`.
### Urgency
_No response_
### Target platform
armv7
### Build script
`Dockerfile-debian.arm32v7`
…
-
##### System information (version)
- OpenVINO => 2022.2
- Operating System / Platform => Windows 64-bit
- Compiler => Visual Studio 2019
- Problem classification => OpenVINO load issue
##### Det…
-
### Describe the issue
I'm trying to build onnxruntime on a Radxa-Zero, but I've found that it does not support BFLOAT16 instructions. As a result, your builds fail because they require tho…
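One way to confirm missing instruction support before attempting a build is to check the CPU feature flags the kernel reports. The sketch below parses `/proc/cpuinfo`-style text; the `cpu_has_flag` helper is a name introduced here, and the exact flag string to look for (e.g. `bf16` on AArch64) is an assumption:

```python
def cpu_has_flag(flag, cpuinfo_text):
    """Return True if `flag` appears in a flags/Features line of cpuinfo text.

    x86 kernels label the line "flags"; Arm kernels label it "Features".
    """
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() in ("flags", "features"):
            if flag in value.split():
                return True
    return False

def host_cpu_has_flag(flag):
    # Reads the live /proc/cpuinfo — Linux only.
    with open("/proc/cpuinfo") as f:
        return cpu_has_flag(flag, f.read())
```

On a board like the one above, `host_cpu_has_flag("bf16")` returning `False` would indicate the hardware does not advertise the instructions the build assumes.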
-
I want to use the GPU for inference in a C# UWP app using the latest Windows ML NuGet package, 1.5.1.
However, currently only CPU inference is supported. I tried to use the ONNX Runtime directly with…
-
**Describe the bug**
When trying to build onnxruntime with the Vitis-AI runtime according to the build instructions at https://onnxruntime.ai/docs/reference/execution-providers/Vitis-AI-ExecutionProv…
-
### Describe the issue
I cannot compile onnxruntime (even on master).
I began with CMake:
```
cmake /c/lib/onnxruntime/cmake -DCMAKE_INSTALL_PREFIX=c:/install/onnxruntime \
-D…
```