ROCm / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

Install flash-attention failed #80

Open sdfasfsdfasfasafd opened 1 month ago

sdfasfsdfasfasafd commented 1 month ago

I installed flash-attention following this link: https://rocm.blogs.amd.com/artificial-intelligence/flash-attention/README.html

My GPU is gfx1100 (7900 XTX).

I installed it in Docker, and the Docker setup follows this link: https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/wsl/install-pytorch.html

Here is the Error info:

Failed to build flash_attn
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash_attn)

And if I use `python setup.py install` instead:

Here is the error info:

  File "/download/flash-attention/setup.py", line 490, in <module>
    setup(
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 153, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 74, in run
    self.do_egg_install()
  File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 116, in do_egg_install
    self.run_command('bdist_egg')
  File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 164, in run
    cmd = self.call_command('install_lib', warn_dir=0)
  File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 150, in call_command
    self.run_command(cmdname)
  File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3/dist-packages/setuptools/command/install_lib.py", line 23, in run
    self.build()
  File "/usr/lib/python3.10/distutils/command/install_lib.py", line 109, in build
    self.run_command('build_ext')
  File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 79, in run
    _build_ext.run(self)
  File "/usr/lib/python3.10/distutils/command/build_ext.py", line 340, in run
    self.build_extensions()
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 866, in build_extensions
    build_ext.build_extensions(self)
  File "/usr/lib/python3.10/distutils/command/build_ext.py", line 449, in build_extensions
    self._build_extensions_serial()
  File "/usr/lib/python3.10/distutils/command/build_ext.py", line 474, in _build_extensions_serial
    self.build_extension(ext)
  File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 202, in build_extension
    _build_ext.build_extension(self, ext)
  File "/usr/lib/python3.10/distutils/command/build_ext.py", line 529, in build_extension
    objects = self.compiler.compile(sources,
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 679, in unix_wrap_ninja_compile
    _write_ninja_file_and_compile_objects(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 1785, in _write_ninja_file_and_compile_objects
    _run_ninja_build(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 2121, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension
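Note that this traceback only shows the Python build machinery; the actual compiler failure is printed earlier by ninja. One way to surface it (a suggestion, assuming the build goes through torch.utils.cpp_extension, which honors MAX_JOBS) is to rebuild verbosely with a single job so the first failing compiler command is easy to spot:

MAX_JOBS=1 python3 setup.py build_ext 2>&1 | tee build.log
# search build.log for the first "FAILED:" line; the real compiler error follows it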

evshiron commented 1 month ago

Which branch are you using?

For Navi31, you might want to follow this thread:

Or use a Triton-based implementation.
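For reference, a minimal sketch of such a fallback, assuming a ROCm-enabled PyTorch build (this uses PyTorch's built-in scaled_dot_product_attention rather than this repo's kernels):

python3 - <<'EOF'
# Sketch: attention without flash_attn, via PyTorch's built-in fused path.
# Tensor layout is (batch, heads, seqlen, head_dim); fp16 keeps it on the fast path.
import torch
import torch.nn.functional as F

q = torch.randn(2, 8, 512, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 512, 64])
EOF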

sdfasfsdfasfasafd commented 1 month ago

> Which branch are you using?
>
> For Navi31, you might want to follow this thread:
>
> Or use a Triton-based implementation.

I have tried both navi_support and main.

evshiron commented 1 month ago

I tried these steps in WSL, and it compiled successfully:

git clone https://github.com/ROCm/flash-attention
cd flash-attention
git checkout howiejay/navi_support

python3 -m venv venv
source venv/bin/activate

pip3 install cmake ninja wheel packaging
# install a working torch; in my case a custom build for WSL is used
pip3 install torch --index-url https://download.pytorch.org/whl/rocm6.1

python3 setup.py bdist_wheel
# and you get a `flash_attn*.whl` in `dist/`
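Presumably (this step is my assumption, not spelled out in the original comment) the wheel is then installed and smoke-tested with:

pip3 install dist/flash_attn*.whl
python3 -c "import flash_attn; print(flash_attn.__version__)"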
sdfasfsdfasfasafd commented 1 month ago

> I tried these steps in WSL, and it compiled successfully:
>
> git clone https://github.com/ROCm/flash-attention
> cd flash-attention
> git checkout howiejay/navi_support
>
> python3 -m venv venv
> source venv/bin/activate
>
> pip3 install cmake ninja wheel packaging
> # install a working torch; in my case a custom build for WSL is used
> pip3 install torch --index-url https://download.pytorch.org/whl/rocm6.1
>
> python3 setup.py bdist_wheel
> # and you get a `flash_attn*.whl` in `dist/`

I tried this too, but I get the same error as the `python setup.py install` run I posted at the top.

evshiron commented 1 month ago

Could you provide a complete log?

sdfasfsdfasfasafd commented 1 month ago

> Could you provide a complete log?

install.log

thank you

evshiron commented 1 month ago

Seems like a network problem?

sdfasfsdfasfasafd commented 1 month ago

> Seems like a network problem?

Thanks a lot. I have switched to my best proxy; if it is still a network problem, I have no other option. o(╥﹏╥)o

evshiron commented 1 month ago
Submodule 'csrc/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'csrc/composable_kernel'
Cloning into '/download/flash-attention/csrc/composable_kernel'...
fatal: unable to access 'https://github.com/ROCm/composable_kernel.git/': Could not resolve host: github.com
fatal: clone of 'https://github.com/ROCm/composable_kernel.git' into submodule path '/download/flash-attention/csrc/composable_kernel' failed

You should fix this.
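If the container genuinely cannot resolve github.com, one possible fix (an assumption based on the log above; <proxy-host> and <port> are placeholders for your own proxy) is to route git through the proxy and re-fetch the submodule before rebuilding:

# placeholders: replace <proxy-host>:<port> with your actual proxy
git config --global http.proxy http://<proxy-host>:<port>
git config --global https.proxy http://<proxy-host>:<port>
# re-fetch the submodule that failed to clone, then retry the build
git submodule sync
git submodule update --init --recursive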

sdfasfsdfasfasafd commented 1 month ago
> python3 setup.py bdist_wheel

The same error occurs with that as well: install1.log

evshiron commented 1 month ago

Run these steps from scratch (new shell and new location):

nktice commented 1 month ago

> Run these steps from scratch (new shell and new location):
>
> * [Install flash-attention failed #80 (comment)](https://github.com/ROCm/flash-attention/issues/80#issuecomment-2328438457)

It seems worth noting that this flash-attention version is 2.0.4, a version so old that it is basically useless for many programs, as it predates a lot of major changes that newer software needs.

sancspro commented 1 month ago

Facing the same issue: RuntimeError: Error compiling objects for extension

I think the howiejay/navi_support branch is not compatible with ROCm 6.2 (or with PyTorch 2.5.0.dev20240908+rocm6.2), which is what I run now.

Another thing I noticed is that ROCm 6.2 needs the PyTorch nightly built for ROCm 6.2; older PyTorch versions cause issues. The PyTorch I'm using:

pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.2/
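A quick way to check for such a mismatch (a sketch, assuming a standard ROCm install under /opt/rocm) is to compare the HIP version PyTorch was built against with the system ROCm version:

# HIP version the installed torch was built against
python3 -c "import torch; print(torch.__version__, torch.version.hip)"
# ROCm version installed on the system
cat /opt/rocm/.info/version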

sdfasfsdfasfasafd commented 1 month ago

> Facing the same issue: RuntimeError: Error compiling objects for extension
>
> I think the howiejay/navi_support branch is not compatible with ROCm 6.2 (or with PyTorch 2.5.0.dev20240908+rocm6.2), which is what I run now.
>
> Another thing I noticed is that ROCm 6.2 needs the PyTorch nightly built for ROCm 6.2; older PyTorch versions cause issues.

Yes, I first tried ROCm 6.2 on Ubuntu 24.04 and faced the same issue. So I switched to ROCm 6.1 on Ubuntu 22.04, but the problem remains. As of now it is still unsolved.