foreverpiano opened 5 months ago
You may want to try simply `MAX_JOBS=8 pip install flash-attn --no-build-isolation`. It seems to be working for me for now.
P.S. 8 jobs seem to take about 200 GB of RAM, so you can adjust this parameter accordingly.
P.P.S. Other people are noting that this is not working for them. Some more info: I had reinstalled Anaconda with Python 3.8 and PyTorch 2.3.0. As others (@rkuo2000) have mentioned, only some specific version combinations work, so you can try those.
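If RAM is the limiting factor, here is a minimal sketch of the same approach with a smaller job count (the value 4 is just an example; pick whatever fits your machine):

```bash
# Fewer parallel compile jobs -> lower peak RAM during the flash-attn build.
# 8 jobs reportedly needed around 200 GB, so scale down on smaller machines.
MAX_JOBS=4 pip install flash-attn --no-build-isolation
```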
Same error. @saurabh-kataria's solution didn't work.
@CyberTimon Yes, it doesn't work.
I could "fix" it by updating Python to 3.11, but that's not the correct solution.
Python 3.10.14, CUDA 12.1, Ubuntu 22.04.4 LTS, torch==2.3.0, flash-attn==2.5.8 works (2.5.9.post1 has the same failure).
Thanks. I tried Python 3.9.19, torch==2.3.0, flash-attn==2.5.8. It works.
Thanks. Python 3.9.19, CUDA 12.2, torch==2.3.0, flash_attn==2.5.8. It works!
Thanks, flash_attn==2.5.8 works!
flash_attn==2.5.8 works, thanks.
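Summarizing the reports above, a hedged recipe for a fresh environment (the exact versions are taken from this thread; a CUDA 12.x driver is assumed to be available):

```bash
# Version combination repeatedly reported to work: torch 2.3.0 + flash-attn 2.5.8
pip install torch==2.3.0
MAX_JOBS=8 pip install flash-attn==2.5.8 --no-build-isolation
```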
@zhangj1an @oceaneLIU Hello, how did you install flash-attn 2.5.8? I get the following error:
Building wheels for collected packages: flash-attn
Building wheel for flash-attn (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [19 lines of output]
fatal: not a git repository (or any of the parent directories): .git
torch.__version__ = 2.4.1+cu121
/opt/conda/envs/mgm/lib/python3.10/site-packages/setuptools/__init__.py:94: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated.
!!
********************************************************************************
Requirements should be satisfied by a PEP 517 installer.
If you are using pip, you can try `pip install --use-pep517`.
********************************************************************************
!!
dist.fetch_build_eggs(dist.setup_requires)
running bdist_wheel
Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
error: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)>
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash-attn)
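The traceback above shows setup.py giving up on a local build and trying to download a prebuilt wheel, which then fails on SSL certificate verification. One possible workaround, sketched here, is to fetch the wheel manually and install it from the local file. Note that the exact wheel name depends on your Python/torch/CUDA combination and not every combination is published on the release page, so the URL below (the one setup.py guessed) is only an example:

```bash
# Download the prebuilt wheel by hand (ideally on a network where certificate
# verification works, or after fixing the local CA certificates).
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

# Install from the local file so the build step needs no network access.
pip install ./flash_attn-2.5.8+cu122torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```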
CUDA 11.8, Python 3.11, pytorch==2.3.0, flash_attn==2.5.8 works, thanks for all the discussion!
conda install pytorch==2.3.0 torchvision==0.18.0 torchaudio==2.3.0 pytorch-cuda=11.8 -c pytorch -c nvidia
pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu118torch2.3cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
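After installing from the prebuilt wheel, a quick sanity check (assuming flash_attn exposes __version__, as recent releases do):

```bash
python -c "import flash_attn; print(flash_attn.__version__)"
python -c "import torch; print(torch.__version__, torch.version.cuda)"
```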
Thanks a lot! It works.
It works. Thanks!
I built flash_attn from source with pytorch 2.3.0.
code
my env:
@tridao
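For completeness, a rough sketch of the from-source route mentioned in this last comment (the tag and job count are illustrative; a CUDA toolkit matching your installed torch 2.3.0 build is assumed):

```bash
# Build flash-attn from source against the already-installed torch 2.3.0.
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
git checkout v2.5.8            # tag reported to work earlier in the thread
MAX_JOBS=8 pip install . --no-build-isolation
```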