Open mamba824824 opened 8 months ago
Can you try this?
pip uninstall flash-attn
FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn
I think this issue thread will come in handy: https://github.com/oobabooga/text-generation-webui/issues/4182
@mamba824824
It works. Thanks. @ayulockin
File "lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
I have tried changing the version of flash-attn, but the ImportError still occurs. How can I solve it? @ayulockin
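An undefined-symbol ImportError like this one typically means the flash-attn binary was compiled against a different PyTorch build than the one currently installed, so checking that both package versions match the combination they were built for is a reasonable first step. Below is a minimal, hedged sketch for printing the installed versions; the distribution names are the standard PyPI ones, but your environment may differ:

```python
from importlib.metadata import version, PackageNotFoundError

def report(dist: str) -> str:
    # Look up the installed version of a distribution by its PyPI name;
    # report cleanly when the package is not installed at all.
    try:
        return f"{dist}: {version(dist)}"
    except PackageNotFoundError:
        return f"{dist}: not installed"

# "torch" and "flash-attn" are the usual PyPI distribution names
# (an assumption; adjust if your environment installs them differently).
for dist in ("torch", "flash-attn"):
    print(report(dist))
```

If the versions are mismatched, reinstalling flash-attn with a forced source build against the currently installed PyTorch (as suggested earlier in this thread) is the usual fix.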