ali-vilab / videocomposer

Official repo for VideoComposer: Compositional Video Synthesis with Motion Controllability
https://videocomposer.github.io
MIT License

cannot install/import flash_attn #44

Open yukyeongmin opened 5 months ago

yukyeongmin commented 5 months ago

Environment: Python 3.8.16, CUDA 11.3, torch 1.12.0, GPU: GeForce RTX 3090

The build failed when I tried pip install flash-attn==0.2:

    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for flash-attn
    Running setup.py clean for flash-attn
    Failed to build flash-attn
    ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects

So I tried the newest version, flash-attn 2.5.6, and importing it fails:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/cvnar1/anaconda3/envs/VideoComposer/lib/python3.8/site-packages/flash_attn/__init__.py", line 3, in <module>
        from flash_attn.flash_attn_interface import (
      File "/home/cvnar1/anaconda3/envs/VideoComposer/lib/python3.8/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
        import flash_attn_2_cuda as flash_attn_cuda
    ImportError: /home/cvnar1/anaconda3/envs/VideoComposer/lib/python3.8/site-packages/flash_attn_2_cuda.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl8GPUTrace13gpuTraceStateE

Please let me know why this error occurs.
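
An undefined C10 symbol like this usually indicates that the prebuilt flash-attn wheel was compiled against a different torch build than the one installed (1.12.0 here); that interpretation is an assumption, not something confirmed in this thread. A minimal sketch to print what the environment actually provides before choosing a flash-attn release:

    # Sketch: report the torch build that a flash-attn wheel must match.
    import torch

    print(torch.__version__)               # 1.12.0 in this environment
    print(torch.version.cuda)              # CUDA toolkit torch was built with
    print(torch.cuda.is_available())       # True if the RTX 3090 is visible
    print(torch.cuda.get_device_name(0))   # e.g. "NVIDIA GeForce RTX 3090"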

cmh1027 commented 4 months ago

Try the newest version of flash_attn and change mha_flash.py to use "from flash_attn import flash_attn_qkvpacked_func".
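
A minimal sketch of that import change, assuming the flash-attn 2.x packed-QKV interface; the tensor shapes and call site below are illustrative, not the exact code in mha_flash.py:

    # Sketch: flash-attn 2.x replaces the old flash_attn.flash_attn_interface
    # imports with a top-level packed-QKV function.
    import torch
    from flash_attn import flash_attn_qkvpacked_func

    # qkv packed as (batch, seqlen, 3, num_heads, head_dim), fp16/bf16 on CUDA.
    qkv = torch.randn(2, 128, 3, 8, 64, dtype=torch.float16, device="cuda")
    out = flash_attn_qkvpacked_func(qkv, dropout_p=0.0, causal=False)
    print(out.shape)  # torch.Size([2, 128, 8, 64])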