urchade / GLiNER

Generalist and Lightweight Model for Named Entity Recognition (Extract any entity types from texts) @ NAACL 2024
https://arxiv.org/abs/2311.08526
Apache License 2.0

Converting to ONNX still depends on PyTorch #124

Open milosacimovic opened 3 months ago

milosacimovic commented 3 months ago

Is it possible to export to ONNX and run inference without depending on PyTorch?

Ingvarstep commented 3 months ago

Thank you for pointing this out. You would need to change the processor to rely on NumPy and rewrite part of the conversion script to use ONNX Runtime instead of PyTorch. We will do it shortly, but any contribution from your side that can accelerate this is welcome.
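As a rough sketch of what "change the processor to rely on NumPy" could look like, the snippet below pads variable-length token-id lists into a rectangular batch using only NumPy. The function name `pad_batch` and the `input_ids`/`attention_mask` naming are illustrative assumptions, not the actual GLiNER processor code:

```python
import numpy as np

def pad_batch(sequences, pad_id=0):
    # Hypothetical torch-free collation step: pad variable-length
    # token-id lists to a rectangular int64 batch and build the
    # matching attention mask, using NumPy only.
    max_len = max(len(s) for s in sequences)
    input_ids = np.full((len(sequences), max_len), pad_id, dtype=np.int64)
    attention_mask = np.zeros((len(sequences), max_len), dtype=np.int64)
    for i, seq in enumerate(sequences):
        input_ids[i, : len(seq)] = seq
        attention_mask[i, : len(seq)] = 1
    return input_ids, attention_mask
```

Arrays produced this way can be fed straight to an ONNX Runtime session (e.g. `session.run(None, {"input_ids": ids, "attention_mask": mask})`, assuming those are the exported graph's input names), with no PyTorch tensors involved.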

milosacimovic commented 3 months ago

Do you know of any way to export the tokenizer to ONNX as well? Right now it seems to use torch through transformers, i.e. it is loaded with AutoTokenizer from transformers, which relies on torch.
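One possible route, not confirmed in this thread: the Rust-backed `tokenizers` library has no PyTorch dependency, so the tokenizer could be run through it directly rather than exported to ONNX. The toy vocabulary below is purely illustrative; a real setup would load the model's own tokenizer file (e.g. via `Tokenizer.from_file("tokenizer.json")`):

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel
from tokenizers.pre_tokenizers import Whitespace

# Illustrative toy vocabulary; in practice you would load the
# model's saved tokenizer.json instead of building one by hand.
vocab = {"[UNK]": 0, "hello": 1, "world": 2}
tok = Tokenizer(WordLevel(vocab, unk_token="[UNK]"))
tok.pre_tokenizer = Whitespace()

# encode() returns plain Python lists of ids -- no torch tensors.
encoding = tok.encode("hello world")
```

The resulting id lists can then be converted to NumPy arrays for ONNX Runtime, so the whole pipeline stays torch-free.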