ROCm / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

Installation error #39

Closed — ekazakos closed this issue 1 week ago

ekazakos commented 10 months ago

Hi,

I followed the instructions, but my installation fails. You can find my error log in install_log.txt. The linker complains that there are multiple definitions of the same symbols across different files. Upon checking flash-attention/csrc/flash_attn_rocm/src/, I found that many sources exist twice with identical content under different names, e.g. device_memory.hip and device_memory_hip.hip, which leads to two object files with the same symbols being compiled (device_memory.o and device_memory_hip.o).
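For anyone triaging the same failure, here is a small hypothetical helper (not part of the repo) that lists such pairs before they ever reach the linker. It assumes the duplicates follow the naming pattern described above: a foo.hip next to a byte-identical foo_hip.hip.

```python
import hashlib
from pathlib import Path

def find_hipify_duplicates(src_dir):
    """Return (original, duplicate) name pairs like foo.hip / foo_hip.hip
    whose contents are byte-identical (assumed naming convention)."""
    pairs = []
    for dup in Path(src_dir).glob("*_hip.hip"):
        orig = dup.with_name(dup.name.replace("_hip.hip", ".hip"))
        if orig.exists():
            digest = lambda p: hashlib.sha256(p.read_bytes()).hexdigest()
            if digest(orig) == digest(dup):
                pairs.append((orig.name, dup.name))
    return sorted(pairs)
```

Each reported pair compiles into two object files carrying the same symbols, which is exactly what the multiple-definition linker error is about.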

Any help would be appreciated.

dejay-vu commented 10 months ago

@ekazakos, can you try deleting both device_memory.hip and device_memory_hip.hip and then running python setup.py clean? After that, run pip install . so the build starts from the original device_memory.cpp.
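The pruning step can be scripted if there are many affected files. This is a hypothetical sketch, not part of the repo; it assumes the same foo.hip / foo_hip.hip naming pattern and leaves the original .cpp sources in place for the rebuild:

```python
from pathlib import Path

def remove_duplicated_hipified(src_dir):
    """Delete each foo.hip / foo_hip.hip pair so only the original
    foo.cpp remains for the next build. Returns sorted removed names."""
    removed = []
    for dup in Path(src_dir).glob("*_hip.hip"):
        orig = dup.with_name(dup.name.replace("_hip.hip", ".hip"))
        if orig.exists():
            orig.unlink()
            dup.unlink()
            removed.extend([orig.name, dup.name])
    return sorted(removed)
```

After pruning, running python setup.py clean followed by pip install . rebuilds from the untouched .cpp sources, as suggested above.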

amasin2111 commented 5 months ago

Hi, I am facing a similar issue: after the patch, the header files are not generated for the composable kernel, but the code still hipifies the .cpp files so that they require the hipified headers. Precisely speaking, the device_memory.hpp include is rewritten to device_memory_hip.hpp, but because of the patch we skip generating these header files.
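A quick way to confirm this failure mode is to scan the hipified sources for includes that point at headers which were never generated. This is a hypothetical check, assuming only that the rewritten includes appear as plain #include "..._hip.hpp" lines and that the headers would live next to the sources:

```python
import re
from pathlib import Path

# Matches rewritten includes such as: #include "device_memory_hip.hpp"
INCLUDE_RE = re.compile(r'#include\s+"([^"]+_hip\.hpp)"')

def missing_hipified_headers(src_dir):
    """Return (source, header) pairs where a hipified source includes
    a *_hip.hpp header that does not exist in the same directory."""
    missing = []
    for src in Path(src_dir).glob("*.hip"):
        for header in INCLUDE_RE.findall(src.read_text()):
            if not (Path(src_dir) / header).exists():
                missing.append((src.name, header))
    return sorted(missing)
```

Any pair reported here would fail at compile time with a missing-header error, matching the symptom described above.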

tcgu-amd commented 1 week ago

Hi @dejay-vu, @amasin2111, thanks for reaching out and sorry for the slow response. The issue mentioned should have been fixed in https://github.com/ROCm/flash-attention/blame/641db759ab7168e472909bc9ff1eda4a329de34f/setup.py#L395, which was part of commit https://github.com/ROCm/flash-attention/commit/d8f104e97aae2057842df765816cc1e273f0a380. I will be closing this issue since it is now stale, but please feel free to re-open or post follow-ups if the same errors are still being encountered. Thanks!