microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Documentation] Community blog post contribution #21389

Closed: fabio-sim closed this issue 4 months ago

fabio-sim commented 4 months ago

Describe the documentation issue

Hi, apologies if this isn't the correct place for this. I would like to contribute a community blog post: "Accelerating LightGlue Inference with ONNX Runtime and TensorRT".

Page / URL

https://onnxruntime.ai/blogs
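
For context on the post's topic, here is a minimal sketch of what running an exported model through ONNX Runtime with the TensorRT execution provider looks like. This is not taken from the linked blog post: the model path, input names, and tensor shapes below are placeholders for illustration only, and the provider options shown are standard TensorRT EP settings rather than the author's specific configuration.

```python
# Sketch: running an exported LightGlue-style ONNX model with ONNX Runtime's
# TensorRT execution provider. Paths, input names, and shapes are placeholders.
import numpy as np
import onnxruntime as ort

providers = [
    (
        "TensorrtExecutionProvider",
        {
            "trt_fp16_enable": True,               # allow FP16 TensorRT kernels
            "trt_engine_cache_enable": True,       # cache built engines on disk
            "trt_engine_cache_path": "trt_cache",  # reuse engines across runs
        },
    ),
    "CUDAExecutionProvider",  # fallback for ops TensorRT does not support
    "CPUExecutionProvider",
]

# "lightglue.onnx" is a placeholder for the exported model file.
session = ort.InferenceSession("lightglue.onnx", providers=providers)

# Placeholder inputs: keypoints and descriptors for two images;
# the real input names and shapes depend on how the model was exported.
feeds = {
    "kpts0": np.random.rand(1, 512, 2).astype(np.float32),
    "kpts1": np.random.rand(1, 512, 2).astype(np.float32),
    "desc0": np.random.rand(1, 512, 256).astype(np.float32),
    "desc1": np.random.rand(1, 512, 256).astype(np.float32),
}

# Run the model; passing None returns all model outputs.
outputs = session.run(None, feeds)
```

Listing the providers in priority order lets ONNX Runtime place subgraphs on TensorRT where possible and fall back to CUDA or CPU for the rest, which is the usual pattern for this kind of acceleration.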

sophies927 commented 4 months ago

We can definitely add this to our blogs site - thank you so much for the contribution!

sophies927 commented 4 months ago

Resolved in https://github.com/microsoft/onnxruntime/pull/21445

Added to blogs site: https://onnxruntime.ai/blogs