Dao-AILab / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

/root/flash-attention-main/csrc/flash_attn/src/flash_fwd_kernel.h:7:10: fatal error: cute/tensor.hpp: No such file or directory #1013

Open centyuan opened 3 months ago

centyuan commented 3 months ago

I'm installing from source. What does the error above mean? Does anyone know? Please help.

[screenshot attachment: flash_error]
spongebobpie commented 3 months ago

I encountered the same issue. Have you solved it?

janEbert commented 3 months ago

setup.py should automatically pull CUTLASS. Perhaps you're building on a machine without internet access, so the pull fails, but the build script doesn't error out.

To fix manually before building:

# if building from the repository root
git submodule update --init csrc/cutlass
# if building from the hopper directory
git submodule update --init ../csrc/cutlass
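The check-then-pull pattern described above can be sketched in Python. This is an illustrative helper, not the actual logic in flash-attention's setup.py; the function name `ensure_cutlass` is hypothetical, but the header path `csrc/cutlass/include/cute/tensor.hpp` matches the missing file from the error message.

```python
import os
import subprocess

def ensure_cutlass(repo_root: str) -> bool:
    """Return True if the CUTLASS headers are present, attempting to
    pull the submodule if they are not.

    Illustrative sketch only; flash-attention's real setup.py may differ.
    """
    # cute/tensor.hpp lives inside the CUTLASS submodule's include dir.
    header = os.path.join(
        repo_root, "csrc", "cutlass", "include", "cute", "tensor.hpp"
    )
    if os.path.isfile(header):
        return True
    # Try to fetch the submodule. On an offline machine this fails,
    # and if the result is never checked, the build proceeds and dies
    # later with the "No such file or directory" error above.
    subprocess.run(
        ["git", "submodule", "update", "--init", "csrc/cutlass"],
        cwd=repo_root,
    )
    return os.path.isfile(header)
```

Checking the return value (or the header's existence) before compiling would surface the missing-submodule problem early instead of as a fatal compiler error.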