-
I fine-tuned the GLiNER small v2.1 model and created an ONNX version of the same model using the convert_to_onnx.ipynb example code.
When I compared the inference time of the two models, the ONNX version took 50…
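A fair latency comparison needs a warmup phase and averaging over many runs; below is a minimal timing-harness sketch (the two lambdas are placeholders standing in for the real GLiNER and ONNX predict calls, which are assumptions here):

```python
import time

def benchmark(fn, runs=30, warmup=5):
    """Average wall-clock latency of fn() over `runs` calls, after warmup."""
    for _ in range(warmup):
        fn()
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs

# Placeholders standing in for the real predict calls, e.g.:
#   pytorch_infer = lambda: model.predict_entities(text, labels)
#   onnx_infer    = lambda: onnx_model.predict_entities(text, labels)
pytorch_infer = lambda: sum(range(10_000))
onnx_infer = lambda: sum(range(10_000))

print(f"pytorch: {benchmark(pytorch_infer) * 1e3:.2f} ms")
print(f"onnx:    {benchmark(onnx_infer) * 1e3:.2f} ms")
```

Without warmup, the first ONNX Runtime call includes session/graph setup and can dominate a small sample.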
-
I can't use the ONNX model on Windows/Python.
I get:
DecodeError: Error parsing message with type 'onnx.ModelProto'
Any solutions? Even a simple Python model check fails.
I have every library in…
-
### Feature request
I am trying to train offline RL using a decision transformer and convert it to .onnx.
```
from pathlib import Path
from transformers.onnx import FeaturesManager
feature = "seq…
```
-
### Request Description
[ML.Net](https://dotnet.microsoft.com/en-us/apps/machinelearning-ai/ml-dotnet) models which use ONNX in the backend have some operators that are not supported by OpenVINO
L…
-
After running, the error message is as follows:
[ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from D:\hacksider\Deep-Live-Cam-main\Deep-Live-Cam-main\models\inswapper_128_fp16.onnx failed:Pr…
-
### Feature request
PyTorch 2.5 introduces the `torch.onnx.export(..., dynamo=True)` option and improved export logic for converting models to ONNX. We recommend that optimum leverage this feature…
-
# Bug Report
### Is the issue related to model conversion?
No
### Describe the bug
I don't know if this is considered a bug or expected behavior.
When I compose two models using `onn…
-
Do the quantized models here remain quantized in ONNX after conversion? Can you even convert/export them to ONNX? How about the other way around? Can you export a sparse model to ONNX and quantize in…
-
Below is the list of HF CNN fp32 model issues for [Full model list](https://gist.github.com/jinchen62/cdf54ef8ed725fcce9d6fa18ecbfa058). Tests are imported in https://github.com/nod-ai/SHARK-TestSuite…
-
Thank you for your work; the performance of this model is quite good. I would like to deploy and use it. Is there a way to export it to ONNX?