fabio-sim / LightGlue-ONNX

ONNX-compatible LightGlue: Local Feature Matching at Light Speed. Supports TensorRT, OpenVINO
Apache License 2.0
376 stars 34 forks

Nvidia Tx2 is hard to use #29

Closed demonove closed 1 year ago

demonove commented 1 year ago

Hi, I have run the demo successfully on a PC, but when I try to use it on a TX2 it nearly drives me crazy.

Opset 16 is required by LightGlue.onnx, but the TX2 only supports onnxruntime 1.11.0, which implements opsets only up to 15. And torch on the TX2 requires python==3.6, which stops me from installing onnxruntime>1.12.0.

Have you got any idea?

fabio-sim commented 1 year ago

Hi @demonove, happy to hear that LightGlue-ONNX is useful for you.

ONNX opset 16 is only required because of SuperPoint's grid sample operation. See https://github.com/fabio-sim/LightGlue-ONNX/issues/19 for more information. DISK-LightGlue supports opset 12+:

I hope you find these models helpful!

demonove commented 1 year ago

Hi @fabio-sim, thanks for the models you've provided. I've run into another problem!

When using export.py to export an ONNX model, it pulls in sdpa.py, which imports torch.onnx._constant, torch.onnx._type_utils, and torch.onnx.symbolic_helper. All of these imports are only provided by torch 2.0, but the TX2 can only use torch 1.10.

Is there any way to make export.py work with torch 1.10?

fabio-sim commented 1 year ago

Hi @demonove

Once you already have the exported ONNX models, you don't need to install PyTorch for inference. You only need ONNXRuntime: https://github.com/fabio-sim/LightGlue-ONNX/blob/e82a1a414fe72eb13d5be9c6b026d3e106bb5421/requirements-onnx.txt#L1-L5

However, if for some reason you are required to export on device, you can ignore sdpa.py ops by changing the following line: https://github.com/fabio-sim/LightGlue-ONNX/blob/e82a1a414fe72eb13d5be9c6b026d3e106bb5421/export.py#L8

- from lightglue_onnx.ops import patch_disk_convolution_mode, register_aten_sdpa
+ from lightglue_onnx.ops.convolution_mode import patch_disk_convolution_mode
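Instead of editing the import by hand, one option is to gate the sdpa patch on the installed torch version, so the same export.py runs on both torch 1.10 (TX2) and torch 2.x. A sketch, with the helper name being hypothetical:

```python
import re

def sdpa_patch_supported(torch_version: str) -> bool:
    """True when torch.onnx ships the private helpers sdpa.py imports (torch >= 2.0).

    Hypothetical helper, not part of the repo; parses version strings like
    "2.0.1+cu118" or "1.10.0" by reading only the leading major.minor.
    """
    match = re.match(r"(\d+)\.(\d+)", torch_version)
    if match is None:
        return False
    major, minor = int(match.group(1)), int(match.group(2))
    return (major, minor) >= (2, 0)

print(sdpa_patch_supported("1.10.0"))
```

On device, `register_aten_sdpa` would then only be imported and called inside an `if sdpa_patch_supported(torch.__version__):` block.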