Dao-AILab / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv #950

Open zui-jiang opened 1 month ago

zui-jiang commented 1 month ago

ENV

ERROR flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv
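For context, an `undefined symbol` error at import time usually means the prebuilt `flash_attn` wheel was compiled against a different PyTorch version than the one installed, so the extension references a symbol your `libtorch` does not export. A small generic helper (a sketch, not part of flash-attn) can confirm whether a given shared library exports a symbol; here it is demonstrated against libc, but the same check can be pointed at `libtorch_cpu.so` inside your torch installation:

```python
import ctypes
import ctypes.util

def has_symbol(lib_path: str, symbol: str) -> bool:
    """Return True if the shared library at lib_path exports `symbol`.

    ctypes.CDLL does a dlopen; attribute lookup does a dlsym, which
    raises AttributeError when the symbol is absent.
    """
    lib = ctypes.CDLL(lib_path)
    return hasattr(lib, symbol)

# Demo against libc: "printf" exists, the PyTorch c10 symbol does not.
libc = ctypes.util.find_library("c")
print(has_symbol(libc, "printf"))                              # True
print(has_symbol(libc, "_ZN3c104impl3cow11cow_deleterEPv"))    # False
```

To check your actual torch, point `has_symbol` at something like `os.path.join(os.path.dirname(torch.__file__), "lib", "libtorch_cpu.so")` (path is an assumption; the layout can differ between builds). If the symbol is missing there, the wheel/torch versions are mismatched.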

Heartfirey commented 1 month ago

I got the same error when I tried using 2.5.8, and I solved it by temporarily rolling back to version 2.5.7 of flash_attn.

pip install flash_attn==2.5.7

Hope that helps you.
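Pinning to 2.5.7 works because that wheel matches the installed torch ABI. For the curious, demangling the missing symbol shows it belongs to PyTorch's `c10` core library, which suggests (my reading, not confirmed by the maintainers) that the 2.5.8 wheel was built against a newer torch than the one installed:

```shell
# Demangle the missing symbol; it is part of PyTorch's c10 core library
# (copy-on-write storage support, present only in recent torch builds).
echo '_ZN3c104impl3cow11cow_deleterEPv' | c++filt
# prints: c10::impl::cow::cow_deleter(void*)
```

So the alternatives to pinning are upgrading torch to a version that exports this symbol, or rebuilding flash-attn from source against the installed torch (`pip install flash-attn --no-build-isolation`, per the project README) so the extension links against whatever torch you have.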