openai / consistency_models

Official repo for consistency models.

`from flash_attn.flash_attention import FlashAttention`: Where is FlashAttention? Why can't I import it? #64

Open yaoyaoleY opened 2 weeks ago

yaoyaoleY commented 2 weeks ago
RICKand-MORTY commented 2 weeks ago

`flash_attn` is a package. Use `pip install flash_attn==0.2.8` to install it.
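For context, a minimal sketch of why the pin matters, assuming the import path from the issue title: older flash-attn releases (0.2.x/1.x) expose `flash_attn.flash_attention.FlashAttention`, while the 2.x releases dropped that module, which would produce exactly this ImportError.

```python
# Sketch: guard the version-dependent import.
# Assumption: flash_attn==0.2.8 (old API) provides flash_attn.flash_attention.FlashAttention;
# newer 2.x releases removed that module, so the import fails there.
try:
    from flash_attn.flash_attention import FlashAttention  # old (<=1.x) API path
except ImportError as e:
    raise ImportError(
        "flash_attn.flash_attention.FlashAttention not found; "
        "try the pinned release: pip install flash_attn==0.2.8"
    ) from e
```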

yaoyaoleY commented 2 weeks ago

Thanks. But when I install it, I run into a problem:

> note: This error originates from a subprocess, and is likely not a problem with pip.
> ERROR: Failed building wheel for flash_attn
> Running setup.py clean for flash_attn
> Failed to build flash_attn
> ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects

How can I solve it?

RICKand-MORTY commented 2 weeks ago

Maybe your environment is not suitable. Here is mine:

- python: 3.11.5
- torch: 2.0.0
- torchvision: 0.15.1
- flash-attn: 0.2.8

If you still have problems installing flash_attn, try installing it manually; see https://github.com/Dao-AILab/flash-attention
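Since the flash_attn wheel build compiles a CUDA extension, a quick sketch like the one below can help compare environments before retrying the install. It assumes torch and torchvision are already installed; `CUDA_HOME` comes from torch's own cpp_extension utilities.

```python
# Environment dump to compare against the versions listed above.
import sys

import torch
import torchvision
from torch.utils.cpp_extension import CUDA_HOME  # None if no CUDA toolkit/nvcc was found

print("python:", sys.version.split()[0])        # e.g. 3.11.5
print("torch:", torch.__version__)              # e.g. 2.0.0
print("torchvision:", torchvision.__version__)  # e.g. 0.15.1
print("torch built with CUDA:", torch.version.cuda)
print("CUDA_HOME:", CUDA_HOME)
print("GPU available:", torch.cuda.is_available())
```

If `CUDA_HOME` prints `None`, the build has no CUDA toolkit to compile against, which is a common cause of the "Failed building wheel" error above.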

yaoyaoleY commented 2 weeks ago

Thank you very much

RICKand-MORTY commented 2 weeks ago

The environment setup is a bit troublesome; I ran into this before: https://github.com/openai/consistency_models/issues/57