ayulockin / neurips-llm-efficiency-challenge

Starter pack for NeurIPS LLM Efficiency Challenge 2023.
https://llm-efficiency-challenge.github.io/challenge
Apache License 2.0

Issue about flash-attn #12

Open mamba824824 opened 8 months ago

mamba824824 commented 8 months ago

ImportError when importing flash_attn:

File "lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

I have tried changing the version of flash-attn, but the ImportError persists. How can I solve this? @ayulockin
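
This undefined c10::cuda symbol usually points to an ABI mismatch: the prebuilt flash-attn wheel was compiled against a different PyTorch build than the one installed in the environment. A quick way to check (a sketch, assuming torch and flash-attn were both installed via pip):

# Print the installed torch version and the CUDA it was built with,
# then compare against the flash-attn wheel's metadata.
python -c "import torch; print(torch.__version__, torch.version.cuda)"
pip show flash-attn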

ayulockin commented 8 months ago

Can you try this?

pip uninstall flash-attn
FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn
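
Forcing a source build makes flash-attn compile against the PyTorch already installed in the environment, which is what resolves the undefined-symbol error. A fuller sketch of the reinstall (assuming a CUDA toolkit compatible with the installed torch is available; the --no-build-isolation flag keeps the build using that same torch):

# Remove the mismatched prebuilt wheel, then rebuild from source
# against the currently installed torch.
pip uninstall -y flash-attn
FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn --no-build-isolation
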
ayulockin commented 8 months ago

I think this issue thread will come in handy: https://github.com/oobabooga/text-generation-webui/issues/4182

@mamba824824

mamba824824 commented 8 months ago

> I think this issue thread will come in handy: oobabooga/text-generation-webui#4182
>
> @mamba824824

It works. Thanks. @ayulockin