Open justwangweimin opened 1 year ago
I found that there is flash_api.cpp in csrc/flash_attn, but there is no flash_api.obj, so link.exe cannot run. How do I generate the object files such as flash_api.obj, flash_bwd_hdim128_bf16_sm80.obj, etc.? Why aren't these .cpp files being compiled to .obj files? I have installed CMake, VS 2019, and setuptools. For example, the linker expects:
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/flash_api.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim128_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim128_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim160_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim160_fp16_sm80.obj
Here is the link command from the build log:
"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30133\bin\HostX86\x64\link.exe" /EXPORT:PyInit_flash_attn_2_cuda
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/flash_api.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim128_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim128_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim160_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim160_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim192_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim192_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim224_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim224_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim256_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim256_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim32_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim32_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim64_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim64_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim96_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_bwd_hdim96_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim128_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim128_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim160_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim160_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim192_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim192_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim224_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim224_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim256_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim256_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim32_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim32_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim64_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim64_fp16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim96_bf16_sm80.obj
D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn/src/flash_fwd_hdim96_fp16_sm80.obj
/OUT:build\lib.win-amd64-cpython-311\flash_attn_2_cuda.cp311-win_amd64.pyd
/IMPLIB:D:\chatglm2-6b\flash-attention\build\temp.win-amd64-cpython-311\Release\csrc/flash_attn\flash_attn_2_cuda.cp311-win_amd64.lib
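link.exe only runs after setup.py has compiled each source file to a .obj with cl.exe/nvcc, so if those compilers are not on PATH (e.g. when the build is not started from a "x64 Native Tools Command Prompt for VS 2019"), the .obj files are never produced. A minimal, stdlib-only sketch to check which of the usual build tools are visible (the tool names are assumptions about a typical Windows CUDA setup, not something the flash-attn build prints):

```python
import shutil

def check_build_tools(tools=("cl", "nvcc", "ninja")):
    """Map each tool name to its resolved path, or None if it is not on PATH."""
    return {tool: shutil.which(tool) for tool in tools}

if __name__ == "__main__":
    for tool, path in check_build_tools().items():
        # A None here suggests launching the build from a VS developer prompt
        # (for cl.exe) or fixing the CUDA toolkit PATH (for nvcc).
        print(f"{tool}: {path or 'NOT FOUND'}")
```

Running this before `pip install flash-attn` makes it obvious whether the missing-.obj problem is an environment issue rather than a flash-attn bug.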
@tridao
I also have this question.
I don't have experience on Windows. Cutlass 3.2 is supposed to work on Windows, but maybe we need to do more work on the FlashAttention side to enable Windows support. I don't have bandwidth now to investigate this, lmk if you figure out something.
@tridao
Hey, how did you end up solving this problem?
My problem is exactly the same as yours.
Can't reply in detail, so briefly: as of last year, flash_attn didn't support Windows and could only be installed on Linux. I don't know what the situation is now.
Same problem.
Requirement already satisfied: torch in c:\programdata\anaconda3\lib\site-packages (from flash-attn==2.0.4) (2.1.0.dev20230721+cu121)
Requirement already satisfied: einops in c:\programdata\anaconda3\lib\site-packages (from flash-attn==2.0.4) (0.6.1)
Requirement already satisfied: packaging in c:\programdata\anaconda3\lib\site-packages (from flash-attn==2.0.4) (23.0)
Requirement already satisfied: ninja in c:\programdata\anaconda3\lib\site-packages (from flash-attn==2.0.4) (1.11.1)
Requirement already satisfied: filelock in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (3.12.2)
Requirement already satisfied: typing-extensions in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (4.7.1)
Requirement already satisfied: sympy in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (1.11.1)
Requirement already satisfied: networkx in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (3.1)
Requirement already satisfied: jinja2 in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (3.1.2)
Requirement already satisfied: fsspec in c:\programdata\anaconda3\lib\site-packages (from torch->flash-attn==2.0.4) (2023.3.0)
Requirement already satisfied: MarkupSafe>=2.0 in c:\programdata\anaconda3\lib\site-packages (from jinja2->torch->flash-attn==2.0.4) (2.1.1)
Requirement already satisfied: mpmath>=0.19 in c:\programdata\anaconda3\lib\site-packages (from sympy->torch->flash-attn==2.0.4) (1.3.0)
Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [68 lines of output]
  note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
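pip hides the actual compiler failure behind the truncated "[68 lines of output]" block, so the real error is easy to miss. A small hypothetical helper (the regex is a rough assumption about MSVC's `error C1083` / link.exe's `error LNK2001` diagnostic formats) to pull the first error line out of a saved full build log:

```python
import re

# Heuristic for typical MSVC/nvcc/link.exe diagnostics such as
# "fatal error C1083", "error C2039", or "error LNK2001".
ERROR_RE = re.compile(r"\b(fatal\s+error|error\s+[A-Z]+\d+)\b", re.IGNORECASE)

def first_error_line(log_text: str):
    """Return the first line that looks like a compiler/linker error, or None."""
    for line in log_text.splitlines():
        if ERROR_RE.search(line):
            return line.strip()
    return None
```

Capturing the full log with `pip install flash-attn --no-cache-dir -v > build.log 2>&1` and then scanning it this way usually surfaces the root cause (e.g. a missing header or an unsupported compiler) instead of the generic wheel-build failure.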