-
I `git clone`d the code from branch `main`.
This is my CUDA home, and I have exported these environment variables:
```
export ORT_DYLIB_PATH=/opt/app/models/onnxruntime/lib/libonnxruntime.so.1.20.0
export CUDA_HOME=/opt/app/cuda/
…
```
-
### System Info
transformers.js: 3.0.2
Chrome: 130
OS: macOS
### Environment/Platform
- [X] Website/web-app
- [ ] Browser extension
- [ ] Server-side (e.g., Node.js, Deno, Bun)
- [ ] Des…
-
We have anecdotal evidence that the Python layer between libtorch and the Postgres calls can increase cost by as much as 4x on large batches. In the end state, I think we should be boiling the…
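The per-row interpreter overhead behind that 4x figure is easy to demonstrate in isolation. The sketch below is purely illustrative (a toy reduction, not libtorch or Postgres): a loop written in Python pays interpreter dispatch per element, while the builtin `sum` runs the same loop in C — the kind of boundary-crossing cost a Python layer multiplies on large batches.

```python
import time

def python_sum(xs):
    # One interpreter-dispatched iteration per row -- the Python-layer cost.
    total = 0
    for x in xs:
        total += x
    return total

data = list(range(1_000_000))

t0 = time.perf_counter()
slow = python_sum(data)   # loop runs in the interpreter
t1 = time.perf_counter()
fast = sum(data)          # same loop, executed in C
t2 = time.perf_counter()

assert slow == fast
print(f"python loop: {t1 - t0:.4f}s, builtin sum: {t2 - t1:.4f}s")
```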
-
### Describe the issue
During the optimization stage of ONNX Runtime, a BatchNorm preceded by a Conv operator is fused into the Conv operator and eliminated. The same process should apply to bat…
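For reference, the Conv+BatchNorm fusion amounts to folding the BN affine transform into the convolution's weights and bias, one scale per output channel. A minimal pure-Python sketch of that folding (the names and flat weight layout are illustrative, not ONNX Runtime's actual code):

```python
import math

def fold_batchnorm_into_conv(weight, bias, gamma, beta, mean, var, eps=1e-5):
    """Fold y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta
    into the conv itself, so the BatchNorm node can be dropped."""
    new_weight, new_bias = [], []
    for c in range(len(weight)):
        scale = gamma[c] / math.sqrt(var[c] + eps)
        # Scale every filter tap of channel c, then fold mean/beta into the bias.
        new_weight.append([w * scale for w in weight[c]])
        new_bias.append((bias[c] - mean[c]) * scale + beta[c])
    return new_weight, new_bias
```

After folding, conv(x) with the new weights reproduces BN(conv(x)) exactly, which is why the optimizer can eliminate the BatchNorm node.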
-
The question is: how do you free memory?
https://github.com/triton-inference-server/onnxruntime_backend/issues/103
When the model is deployed to a single card, I can specify real-time release of…
-
### GitHub Tags:
#A11ySev2; #A11yTCS; #DesktopWeb; #SH_ONNX Runtime & Ecosystem_Web_Mar2024; #ONNX Runtime & Ecosystem; #WCP; #Win11; #FTP; #P3_WCP; #ONNX Runtime; #ChromiumEdge; A11yWCAG2.1; #A11yMA…
-
### Feature request
I want to export my model to ONNX, but an error happened, like "xxx is not supported". For example, when I export a BLIP model "Salesforce/blip-image-captioning-large" from the Hug…
-
### Describe the issue
I am trying to encapsulate the torchvision.ops.nms function in an ONNX model. There is no problem with the model conversion and inference, but the output of the derived ONNX mod…
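When the ONNX output diverges, a pure-Python reference NMS is handy for checking which side is wrong. This is a sketch of the standard greedy algorithm (corner-format boxes, score-descending order), assumed here to match `torchvision.ops.nms` semantics rather than its actual implementation:

```python
def iou(a, b):
    # Boxes in (x1, y1, x2, y2) corner format.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    # Greedy NMS: keep the highest-scoring box, drop overlapping boxes, repeat.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(boxes[i], boxes[j]) <= iou_threshold]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # → [0, 2]: the second box overlaps the first above 0.5 IoU
```

Feeding the same boxes to the exported model makes it easy to tell whether the discrepancy comes from the export or from the ONNX runtime's NMS operator.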
-
I have combined the phoneme sets for all three languages
(English, Chinese, and Japanese) and started fine-tuning using a dataset comprising speech in all three languages.
The base model I use is the Chine…
-
### Describe the issue
Can anyone successfully use ONNX and YOLOv5?
### To reproduce
Can anyone successfully use ONNX and YOLOv5?
### Urgency
_No response_
### Platform
Windows
### OS Ve…