fabio-sim / LightGlue-ONNX

ONNX-compatible LightGlue: Local Feature Matching at Light Speed. Supports TensorRT, OpenVINO
Apache License 2.0

Exporting the operator 'aten::scaled_dot_product_attention' to ONNX opset version 17 is not supported. #45

Closed. Albert337 closed this issue 12 months ago.

Albert337 commented 1 year ago

Hi, first of all, thanks for sharing this project with us! However, when I try to run `export.py`, I get the following error:

torch.onnx.errors.UnsupportedOperatorError: Exporting the operator 'aten::scaled_dot_product_attention' to ONNX opset version 17 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issues.

I referred to issues #19 and #32. My environment is:

- WSL2, Ubuntu 22.04
- Python 3.10.13
- CUDA 11.8, cuDNN 8.9.4
- TensorRT 8.6.1
- numpy==1.24.1
- onnxruntime-gpu==1.16.0
- opencv-python==4.8.0.76
- matplotlib==3.8.0

I tried changing the opset value, but it did not work. Do you have any suggestions? Thanks!
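For reference, here is a minimal standalone snippet (my own, not taken from `export.py`) that triggers the same error on my setup:

```python
# Assumed minimal reproduction, not from the repo: any module that calls
# F.scaled_dot_product_attention hits the same exporter error on my setup.
import torch
import torch.nn.functional as F


class Attn(torch.nn.Module):
    def forward(self, q, k, v):
        return F.scaled_dot_product_attention(q, k, v)


q = k = v = torch.randn(1, 4, 16, 64)  # (batch, heads, seq, dim)
# Raises torch.onnx.errors.UnsupportedOperatorError for 'aten::scaled_dot_product_attention'
torch.onnx.export(Attn(), (q, k, v), "attn.onnx", opset_version=17)
```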

fabio-sim commented 1 year ago

Hi @Albert337, thank you for your interest in LightGlue-ONNX.

What's your PyTorch version? Version 2.1.0 is required. Please refer to requirements-export.txt: https://github.com/fabio-sim/LightGlue-ONNX/blob/85d7050930cbfc72ce9e806bced05eab6b56732c/requirements-export.txt#L1-L8
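As a quick sanity check before running `export.py` (just a sketch; the version parsing below is simplified), you can verify the installed version, since the TorchScript ONNX exporter only gained support for decomposing `scaled_dot_product_attention` in torch 2.1:

```python
# Sanity check: export requires torch >= 2.1.0 (see requirements-export.txt).
import torch

major, minor = (int(x) for x in torch.__version__.split(".")[:2])
print(f"torch {torch.__version__}")
assert (major, minor) >= (2, 1), "Please upgrade, e.g. pip install torch==2.1.0"
```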

Kanan99 commented 1 year ago

Can the model be exported with Torch 2.1.0, and can the resulting TRT engine then be used with a different version of CUDA and Torch on the same device?

fabio-sim commented 1 year ago

Hi @Kanan99, thank you for your interest in LightGlue-ONNX.

Yes, that's the whole point of exporting to ONNX: platform interoperability, so that you can avoid having to install PyTorch as a dependency at inference time.
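A rough sketch of that deployment path (the file name and input names are placeholders; inspect the exported model for the real signature):

```python
# Rough sketch of PyTorch-free inference with onnxruntime-gpu; the TensorRT
# execution provider builds and caches the TRT engine under the hood.
# "superpoint_lightglue.onnx" and the feed names are placeholders.
import onnxruntime as ort

session = ort.InferenceSession(
    "superpoint_lightglue.onnx",  # placeholder path to the exported model
    providers=[
        "TensorrtExecutionProvider",  # falls back to the next provider if unavailable
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)

# Inspect the actual input/output signature of the exported model.
print([(i.name, i.shape) for i in session.get_inputs()])
print([(o.name, o.shape) for o in session.get_outputs()])

# outputs = session.run(None, {"image0": img0, "image1": img1})  # names are placeholders
```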