fabio-sim / LightGlue-ONNX

ONNX-compatible LightGlue: Local Feature Matching at Light Speed. Supports TensorRT, OpenVINO
Apache License 2.0

TensorRT #16

Closed: bfialkoff closed this issue 1 year ago

bfialkoff commented 1 year ago

This probably isn't super helpful, but LightGlue removed the einsum dependency: https://github.com/cvg/LightGlue/pull/25. Based on the README, einsum was blocking support for TensorRT, so I thought I'd bring it up. I couldn't find the original issue, which would've been the proper place to raise this.

fabio-sim commented 1 year ago

Hi @bfialkoff , thanks for the heads-up!

To be precise, I believe it was the einops dependency that was removed, not necessarily the einsum operations in the FastAttention/Attention forward pass here (the ellipsis `...` shorthand is not supported by TensorRT):

https://github.com/fabio-sim/LightGlue-ONNX/blob/17353135055aa868e9d9c7279df86ea50a4caa12/lightglue_onnx/lightglue.py#L75-L76
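For illustration, here is a minimal sketch of the kind of ellipsis einsum used in attention score computation and an equivalent explicit formulation. The exact subscript string and shapes below are assumptions, not the actual LightGlue code; the point is that a `...`-style einsum can often be rewritten as a plain matmul over a transposed operand, which exports to ONNX as MatMul/Transpose nodes that TensorRT handles.

```python
import numpy as np

# Hypothetical shapes: (batch, heads, tokens, head_dim)
rng = np.random.default_rng(0)
q = rng.random((2, 4, 8, 16), dtype=np.float32)
k = rng.random((2, 4, 8, 16), dtype=np.float32)

# Ellipsis einsum pattern, as commonly used for attention scores.
scores_einsum = np.einsum("...id,...jd->...ij", q, k)

# Equivalent explicit formulation: matmul with the key transposed
# on its last two axes. No ellipsis shorthand required.
scores_matmul = q @ np.swapaxes(k, -1, -2)
```

Both produce a `(2, 4, 8, 8)` score tensor with identical values, so the rewrite changes only how the graph is expressed, not the result.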

However, I did try a different code path by changing the following line from FastAttention to FlashAttention so that the einsums are avoided,

https://github.com/fabio-sim/LightGlue-ONNX/blob/17353135055aa868e9d9c7279df86ea50a4caa12/lightglue_onnx/lightglue.py#L151-L153

and tried converting the ONNX model to TensorRT engine using polygraphy, but another error pops up near the end of the conversion.

[E] 10: Could not find any implementation for node {ForeignNode[/extractor_1/Transpose_3.../lightglue/Where_2]}.
[E] 10: [optimizer.cpp::nvinfer1::builder::cgraph::LeafCNode::computeCosts::3869] Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[/extractor_1/Transpose_3.../lightglue/Where_2]}.)
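For reference, a polygraphy conversion along these lines can be invoked as below. The model and output paths are placeholders, not the actual files from this repository:

```shell
# Hypothetical paths; substitute the exported LightGlue ONNX model.
polygraphy convert model.onnx --convert-to trt -o model.engine
```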

If anyone more knowledgeable in TensorRT has a fix, please feel free to contribute a PR.

fabio-sim commented 1 year ago

I pushed a patch (#17) to support the TensorrtExecutionProvider under ONNX Runtime. I hope it works on your end!
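For anyone wanting to try this route, a minimal sketch of running an exported model through ONNX Runtime with the TensorRT execution provider looks like the following. The model path is a placeholder, and the fallback handling is an assumption for illustration, not code from the patch:

```python
# Provider priority: ONNX Runtime tries TensorRT first; any subgraph
# it cannot handle falls back to the next provider in the list.
providers = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

try:
    import onnxruntime as ort

    # Hypothetical model path; creating the session triggers
    # TensorRT engine compilation for the supported subgraphs.
    session = ort.InferenceSession(
        "weights/superpoint_lightglue.onnx", providers=providers
    )
except Exception:
    # onnxruntime, TensorRT, or the model file may be unavailable here.
    session = None
```

Listing CUDA and CPU after TensorRT keeps inference working even on machines where the TensorRT provider fails to initialize.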