OpenGVLab / Ask-Anything

[CVPR2024 Highlight][VideoChatGPT] ChatGPT with video understanding! And many more supported LMs such as miniGPT4, StableLM, and MOSS.
https://vchat.opengvlab.com/

The version of flash-attention #91

Closed LiJiaqi96 closed 5 months ago

LiJiaqi96 commented 9 months ago

Hi, when using videochat2 with flash-attention, I encountered an import error indicating that the functions used come from flash-attn v2 rather than v1.0.4 (the version pinned in requirements.txt).

I tried installing flash-attn v2 with the other packages unchanged, but it failed. Is there a solution? Should I modify the code, or install flash-attn v2 and adjust the other packages' versions? Thanks a lot!

Andy1621 commented 9 months ago

Hi! Thanks for your question, and sorry for the late response. In my current environment, I use flash-attn 2.0.9.
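
For reference, flash-attn 2.x is published on PyPI, so pinning the version reported here is straightforward. A minimal install sketch, assuming PyTorch and a matching CUDA toolkit are already installed (`--no-build-isolation` is the flag the flash-attn README recommends so the build can see the existing torch):

```bash
# Pin the exact version reported to work in this thread.
pip install flash-attn==2.0.9 --no-build-isolation
```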

LiJiaqi96 commented 9 months ago

Thanks for your reply! I will try flash-attn 2.0.9 and hope it works.

LiJiaqi96 commented 7 months ago

The issue has been solved by using flash-attn 2.0.9, but remember to update the function names in the code to match those in flash-attention v2.
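
For anyone hitting the same error: flash-attn v2 renamed the variable-length ("unpadded") entry points from `flash_attn_unpadded_*` to `flash_attn_varlen_*`. A minimal sketch of a version-tolerant import (the alias `flash_attn_qkvpacked` is my own; the exact symbols VideoChat2 imports may differ):

```python
# flash-attn 2.x renamed the varlen kernels; try the new name first and
# fall back to the 1.x name so the same code runs against either version.
try:
    # flash-attn >= 2.0 naming
    from flash_attn.flash_attn_interface import (
        flash_attn_varlen_qkvpacked_func as flash_attn_qkvpacked,
    )
except ImportError:
    # flash-attn 1.x naming
    from flash_attn.flash_attn_interface import (
        flash_attn_unpadded_qkvpacked_func as flash_attn_qkvpacked,
    )
```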