yaoyaoleY opened 2 weeks ago
It is a package. Use `pip install flash_attn==0.2.8` to install it.
Thanks, but when I install it I run into a problem:

```
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects
```

How can I solve it?
Maybe your environment is not suitable. Here is mine:

- python: 3.11.5
- torch: 2.0.0
- torchvision: 0.15.1
- flash-attn: 0.2.8

If you still have problems installing flash_attn, try installing it manually; see https://github.com/Dao-AILab/flash-attention
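Before attempting a manual build, it may help to confirm your interpreter and torch versions match a known-good environment. This is a minimal sketch (the helper names `version_tuple` and `check_env` are mine, not from any library), pinned to the versions reported above; adjust the pins for your own setup.

```python
def version_tuple(v):
    """Parse a dotted version string like '2.0.0' or '2.0.0+cu118' into ints."""
    return tuple(int(p) for p in v.split("+")[0].split(".") if p.isdigit())

def check_env(python_version, torch_version,
              want_python=(3, 11), want_torch=(2, 0)):
    """Return warnings for versions that differ from the known-good environment
    reported above (python 3.11.5, torch 2.0.0)."""
    problems = []
    if version_tuple(python_version)[:2] != want_python:
        problems.append(f"python {python_version} differs from {want_python}")
    if version_tuple(torch_version)[:2] != want_torch:
        problems.append(f"torch {torch_version} differs from {want_torch}")
    return problems

# Example: compare your versions (e.g. sys.version and torch.__version__)
# against the working environment before running the flash_attn build.
print(check_env("3.11.5", "2.0.0"))   # matches: no warnings
print(check_env("3.8.10", "2.0.0"))   # mismatched python: one warning
```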
Thank you very much.
The environment setup is a bit troublesome; I ran into this before: https://github.com/openai/consistency_models/issues/57