deepseek-ai / DeepSeek-Coder

DeepSeek Coder: Let the Code Write Itself
https://coder.deepseek.com/
MIT License

flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol #19

Closed. blap closed this issue 1 year ago.

blap commented 1 year ago
from transformers import AutoTokenizer, AutoModelForCausalLM

folder = "deepseek-coder-6.7b-instruct"  # local folder name (unused in this snippet)
model_name = "deepseek-ai/deepseek-coder-6.7b-instruct"

# Download the tokenizer and model from the Hugging Face Hub, then move the model to the GPU.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True).cuda()

Traceback (most recent call last):
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1345, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 48, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 8, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/blap/AutoGPTQ-env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c107WarningC1ENS_7variantIJNS0_11UserWarningENS0_18DeprecationWarningEEEERKNS_14SourceLocationESsb

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/blap/AutoGPTQ_teste/update_models.py", line 14, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True).cuda()
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 387, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 740, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 754, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 698, in getattribute_from_module
    if hasattr(module, attr):
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1335, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1347, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/home/blap/AutoGPTQ-env/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c107WarningC1ENS_7variantIJNS0_11UserWarningENS0_18DeprecationWarningEEEERKNS_14SourceLocationESsb
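For anyone debugging the same failure: the mangled name _ZN3c107Warning... demangles to a c10::Warning constructor, a symbol from PyTorch's libc10, so this ImportError almost always means the flash-attn binary was compiled against a different PyTorch build than the one currently installed. A minimal diagnostic sketch (it uses only standard torch attributes and package metadata, nothing project-specific) to surface the mismatch:

from importlib.metadata import version

import torch

# The flash-attn CUDA extension is compiled against a specific PyTorch
# C++ ABI; a version mismatch shows up as the "undefined symbol" above.
print("torch:", torch.__version__)
print("torch CUDA:", torch.version.cuda)

try:
    import flash_attn  # fails here on an ABI mismatch
    print("flash-attn:", version("flash-attn"))
except ImportError as exc:
    print("flash-attn failed to import:", exc)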

blap commented 1 year ago

To avoid this error, I first install the matching Python development headers (sudo apt-get install python3.10-dev for Python 3.10) and then build flash-attn from source:

MAX_JOBS=4 pip3 install --no-build-isolation "flash-attn<2"
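A quick sanity check after the reinstall (a minimal sketch; it simply re-attempts the import that failed in the traceback above, using the PyPI distribution name flash-attn for the metadata lookup):

from importlib.metadata import version

# Re-attempt the import that previously raised the "undefined symbol"
# ImportError; if this succeeds, transformers can load the model again.
import flash_attn

print("flash-attn", version("flash-attn"), "imported successfully")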