-
I have CUDA 12.4, which requires onnxruntime-gpu >= 1.18.1
([CUDA Execution Provider requirements](https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements)),
while inference-gpu 0.18.1 requires onnxruntime-…
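To make the conflict concrete, here is a minimal stdlib-only sketch that compares a resolved onnxruntime-gpu version against the CUDA 12.x minimum from the requirements page above; the installed version shown is a hypothetical pin, not one taken from inference-gpu's actual metadata.

```python
def parse_version(v: str) -> tuple:
    """Turn a version string like '1.18.1' into (1, 18, 1) for tuple comparison."""
    return tuple(int(part) for part in v.split("."))

REQUIRED = parse_version("1.18.1")   # minimum onnxruntime-gpu for CUDA 12.x per the docs
installed = parse_version("1.17.3")  # hypothetical pin resolved by the dependency constraint

print(installed >= REQUIRED)  # False -> this pin cannot satisfy CUDA 12.4
```

Tuple comparison is lexicographic, so it handles multi-digit components (e.g. 1.18.1 vs 1.9.0) correctly, unlike plain string comparison.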
-
### Describe the issue
The following warning occurred when I optimized Babelscape/mrebel-large:
Some non-default generation parameters are set in the model config. These should go …
  warnings.warn(
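This warning is raised by Transformers when generation parameters live in the model's `config.json` rather than a separate `generation_config.json`. One way to address it for a local model directory is to move those keys over by hand; a minimal stdlib sketch, where the key set is a hypothetical example (the real offending keys are listed in the full warning message):

```python
import json
import pathlib

# Hypothetical set of generation parameters to relocate; adjust to match
# the keys named in the warning for the model being optimized.
GEN_KEYS = {"max_length", "num_beams", "early_stopping",
            "no_repeat_ngram_size", "forced_bos_token_id"}

def split_generation_config(model_dir: str) -> dict:
    """Move generation parameters from config.json into generation_config.json."""
    cfg_path = pathlib.Path(model_dir) / "config.json"
    cfg = json.loads(cfg_path.read_text())
    # Pop matching keys out of the model config...
    gen = {k: cfg.pop(k) for k in list(cfg) if k in GEN_KEYS}
    cfg_path.write_text(json.dumps(cfg, indent=2))
    # ...and write them to a standalone generation config file.
    gen_path = pathlib.Path(model_dir) / "generation_config.json"
    gen_path.write_text(json.dumps(gen, indent=2))
    return gen
```

`split_generation_config` is a hypothetical helper name; the same effect can also be achieved in Transformers by saving the model's `generation_config` alongside the model.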
-
**Inference-TensorFlow-Bert-Model-for-High-Performance-in-ONNX-Runtime** ([https://github.com/onnx/tutorials/blob/master/tutorials/Inference-TensorFlow-Bert-Model-for-High-Performance-in-ONNX-Runtime.…
-
### OpenVINO Version
tag 2023.1.0
### Operating System
Windows
### Device used for inference
AUTO
### Framework
ONNX
### Model used
ResNet50 (but the model doesn't matter)
### Issue description
…
-
When I run this standard command for a simple test, I get this error:
(simswap) PS C:\simswap> python test_video_swapsingle.py --crop_size 224 --use_mask --name people --Arc_path arcface_model/…
-
### Checklist
- [X] I have searched related issues but cannot get the expected help.
- [X] I have read the [FAQ documentation](https://github.com/open-mmlab/mmdeploy/tree/main/docs/en/faq.md) but …
-
### Describe the issue
It is impossible to build onnxruntime with the VS2022 build tools + CUDA 11.6 + cuDNN 8.5.0. CUDA and cuDNN are correctly installed, and I checked out 1.13.1 as specified [here](https://onnxruntime…
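For reference, a from-source CUDA build of onnxruntime on Windows is typically invoked along these lines; the CUDA and cuDNN paths below are placeholders for a local install, and the exact flag set should be checked against the build documentation at the v1.13.1 tag:

```bat
:: Sketch only - adjust paths to the local CUDA 11.6 / cuDNN 8.5.0 installs
.\build.bat --config Release --build_shared_lib --parallel ^
  --cmake_generator "Visual Studio 17 2022" ^
  --use_cuda ^
  --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6" ^
  --cudnn_home "C:\tools\cudnn-8.5.0"
```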
-
When I try to run "create_dist_binary.bat" (after installing PyInstaller), it gives me this error:
```
(venv) C:\Users\darkenlord1\Desktop\BDTM\interrogator_rpc>create_dist_binary.bat
(venv…
```
-
I tried it with two different videos, but I'm getting this error.
```
15:51:31.1026474 [E:onnxruntime:, sequential_executor.cc:494 onnxruntime::ExecuteKernel] Non-zero status code returned while run…
```
-
### Problem Description
ROCm 6.2 does not support WSL2: running `amdgpu-install -y --usecase=wsl,rocm --no-dkms` fails with "Unable to locate package hsa-runtime-rocr4wsl-amdgpu"; running amdgpu-in…