Open: zhangfan-algo opened this issue 8 months ago
I am also getting an error when trying to use flash_attn_2_cuda. I am using CUDA 12.2, transformers 38.2, and torch 2.1.2.
RuntimeError: Failed to import transformers.models.mistral.modeling_mistral because of the following error (look up to see its traceback): /.local/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops15sum_IntList_out4callERKNS_6TensorEN3c1016OptionalArrayRefIlEEbSt8optionalINS5_10ScalarTypeEERS2_
Traceback (most recent call last):
  File "/home/mdabdullah-_al-asad/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1390, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "
Me too... I am using CUDA 12.2, transformers 38.2, and torch 2.1.2 as well.
    from exllamav2 import ExLlamaV2Config, ExLlamaV2, ExLlamaV2Cache, \
  File "/home/dell/miniconda3/lib/python3.10/site-packages/exllamav2/__init__.py", line 3, in
Downgrading the flash-attn version to 2.3.0 solved my issue.
Try this:
pip install --no-build-isolation flash-attn==2.3.0
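If the downgrade (or rebuilding flash-attn against the installed torch) works, a quick sanity check is to run a single attention call through the public flash_attn_func wrapper. A minimal sketch, assuming a CUDA GPU supported by flash-attn v2 (Ampere or newer) and fp16 inputs:

```python
# Post-install sanity check: one forward pass through flash-attention.
import torch
from flash_attn import flash_attn_func  # public wrapper over flash_attn_2_cuda

# (batch, seqlen, nheads, headdim) layout expected by flash_attn_func
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

out = flash_attn_func(q, k, v, causal=True)
print("flash-attn OK, output shape:", tuple(out.shape))
```

If this prints an output shape instead of the undefined-symbol ImportError, the installed wheel matches the current torch.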
Same issue here
Same issue here, even though I am using 2.3.0.
env: cuda 12.3, pytorch 2.2.2

  File "/mnt/pfs/zhangfan/system/anaconda/envs/swift/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 692, in getattribute_from_module
    if hasattr(module, attr):
  File "/mnt/pfs/zhangfan/system/anaconda/envs/swift/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1373, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/mnt/pfs/zhangfan/system/anaconda/envs/swift/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1385, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2.modeling_qwen2 because of the following error (look up to see its traceback): /mnt/pfs/zhangfan/system/anaconda/envs/swift/lib/python3.10/site-packages/flash_attn-2.5.5-py3.10-linux-x86_64.egg/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops15sum_IntList_out4callERKNS_6TensorEN3c1016OptionalArrayRefIlEEbSt8optionalINS5_10ScalarTypeEERS2_