tomaarsen / SpanMarkerNER

SpanMarker for Named Entity Recognition
https://tomaarsen.github.io/SpanMarkerNER/
Apache License 2.0
391 stars 27 forks

SpanMarker with ONNX models #26

Open Ulipenitz opened 1 year ago

Ulipenitz commented 1 year ago

Hi @tomaarsen! Is an ONNX exporter planned? Have you tried using SpanMarker with ONNX models for inference? I would be really curious if you have experimented with that already! :-)

tomaarsen commented 1 year ago

Hello!

I have done a very quick experiment to try to export SpanMarker to ONNX, but I got some incomprehensible errors. I don't have enough experience with ONNX at the moment to quickly create such an exporter.

polodealvarado commented 1 year ago

Hi @tomaarsen, I would like to collaborate on this issue.

tomaarsen commented 1 year ago

That would be awesome! I'm open to PRs on the matter.

dbuades commented 11 months ago

This would indeed be a nice feature to add. We export all our models to ONNX before deploying and this is unfortunately not currently possible with SpanMarker.

Keep up the good work!

abhayalok commented 11 months ago

@tomaarsen, could you upload SpanMarker models in ONNX format?

tomaarsen commented 11 months ago

I'm afraid I haven't been able to convert SpanMarker models to ONNX yet.

polodealvarado commented 10 months ago

Hello @tomaarsen . I am independently working on converting span_marker models to the ONNX format and I have started it on a new branch. I would like to share the results to see if we can make progress on it. How would you like to proceed?

tomaarsen commented 10 months ago

Hello!

Awesome! I'd love to get ONNX support for SpanMarker somehow. You can fork the repository and push your branch there. Then, you can open a draft pull request of your branch from your fork into the main branch of this repository, and we'll be able to discuss there, look at results, etc. GitHub actions will then automatically run the tests from your branch to make sure everything is working well. Does that sound good?

polodealvarado commented 10 months ago

Great!

I will push the branch this weekend as soon as I can.

ogencoglu commented 9 months ago

ONNX support would be amazing! One can also quantize the models for further inference speed optimization once the base models are converted to ONNX. It is essentially 5 lines of code from ONNX to quantized ONNX.

ganga7445 commented 6 months ago

@tomaarsen @polodealvarado Is the ONNX implementation done? How can we load the models with ONNX for faster inference? Can you please help here?