junjie18 / CMT

[ICCV 2023] Cross Modal Transformer: Towards Fast and Robust 3D Object Detection

DLL Load failed while importing flash_attn_cuda #22

Open rwilson2CMU opened 1 year ago

rwilson2CMU commented 1 year ago

Hello, thanks for all your hard work! I saw your previous issue describing your specific environment, but I'm running into issues importing flash-attn: flash_attn_cuda isn't in site-packages after installing flash-attn with `pip install flash-attn==0.2.2`. I'm running on Windows 11 with a 2070 Super and the following CUDA compiler:

```
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2020 NVIDIA Corporation
Built on Tue_Sep_15_19:12:04_Pacific_Daylight_Time_2020
Cuda compilation tools, release 11.1, V11.1.74
Build cuda_11.1.relgpu_drvr455TC455_06.29069683_0
```

I get the following error when I run test.py:

```
Traceback (most recent call last):
  File "tools\test.py", line 296, in <module>
    main()
  File "tools\test.py", line 167, in main
    plg_lib = importlib.import_module(_module_path)
  File "C:\Users\xande\anaconda3\envs\CMT4\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\__init__.py", line 6, in <module>
    from .models import *
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\__init__.py", line 2, in <module>
    from .detectors import *
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\detectors\__init__.py", line 1, in <module>
    from .cmt import CmtDetector
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\detectors\cmt.py", line 23, in <module>
    from projects.mmdet3d_plugin.models.utils.grid_mask import GridMask
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\utils\__init__.py", line 2, in <module>
    from .petr_transformer import *
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\utils\petr_transformer.py", line 180, in <module>
    from .attention import FlashMHA
  File "C:\Users\xande\Documents\MultiModal\CMT-master\tools\projects\mmdet3d_plugin\models\utils\attention.py", line 17, in <module>
    from flash_attn.flash_attn_interface import flash_attn_unpadded_kvpacked_func
  File "C:\Users\xande\anaconda3\envs\CMT4\lib\site-packages\flash_attn\flash_attn_interface.py", line 5, in <module>
    import flash_attn_cuda
ImportError: DLL load failed while importing flash_attn_cuda: The specified procedure could not be found.
```
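For reference, a generic way to confirm whether the compiled extension actually made it into the environment (nothing CMT-specific, just standard importlib and pip checks):

```
# Is the compiled extension module present in site-packages?
python -c "import importlib.util; spec = importlib.util.find_spec('flash_attn_cuda'); print(spec.origin if spec else 'flash_attn_cuda not found')"

# List the files the pip package actually installed
pip show -f flash-attn
```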

Thanks for any help you can give!

junjie18 commented 1 year ago

@rwilson2CMU

Hi, you can try the flash-attention release .whl files here. Or you can download the source code and compile it directly with `python setup.py develop`.
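Concretely, the two options look roughly like this; the wheel filename below is a placeholder and the repository URL is just the usual upstream flash-attention repo, so adjust both to whatever matches your Python, CUDA, and torch versions:

```
# Option 1: install a prebuilt wheel downloaded from the flash-attention release page
# (placeholder filename -- pick the file matching your Python / CUDA / torch versions)
pip install flash_attn-0.2.2-cp38-cp38-win_amd64.whl

# Option 2: build the extension from source (assumes the upstream GitHub repository)
git clone https://github.com/HazyResearch/flash-attention.git
cd flash-attention
python setup.py develop
```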

Guo-Yilong commented 1 year ago

@junjie18 flash-attention requires CUDA 11.4 and above, but CMT uses CUDA 11.1.
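To see which CUDA versions are actually in play on a given machine (generic checks):

```
# CUDA toolkit nvcc will use when compiling the flash-attn extension
nvcc --version

# CUDA version the installed PyTorch wheel was built against
python -c "import torch; print(torch.__version__, torch.version.cuda)"
```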

junjie18 commented 1 year ago

@Guo-Yilong You can upgrade CUDA, or download a lower version of flash-attn here.
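For the upgrade route, one possible sequence (the versions below are examples, not a combination tested with CMT) is to install a newer CUDA toolkit, switch to a matching PyTorch build, and rebuild flash-attn so its extension links against the same version:

```
# Example only: after installing a CUDA toolkit >= 11.4 from NVIDIA,
# install a PyTorch build that matches it ...
pip install torch==1.12.1+cu116 --extra-index-url https://download.pytorch.org/whl/cu116

# ... then rebuild flash-attn against the new toolchain
pip install flash-attn==0.2.2 --no-cache-dir --no-build-isolation
```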