TinyLLaVA / TinyLLaVA_Factory

A Framework of Small-scale Large Multimodal Models
https://arxiv.org/abs/2402.14289
Apache License 2.0

dependency conflict #126

Open Nengzyue opened 1 month ago

Nengzyue commented 1 month ago

Has anyone else run into dependency conflicts? The dependencies for my flash-attn install keep conflicting. How do I resolve this?

```
(tinyllava_factory) bdca@bdca-poweredge-t640:~/ynz/tinyllava/TinyLLaVA_Factory$ python tinyllava/serve/app.py --model-path tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B
[2024-10-17 14:07:05,891] [INFO] [real_accelerator.py:191:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1510, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/models/phi/modeling_phi.py", line 56, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/serve/app.py", line 20, in <module>
    from tinyllava.utils import *
  File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/utils/__init__.py", line 7, in <module>
    from .eval_utils import *
  File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/utils/eval_utils.py", line 7, in <module>
    from transformers import StoppingCriteria, PhiForCausalLM
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1501, in __getattr__
    value = getattr(module, name)
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1500, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1512, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.phi.modeling_phi because of the following error (look up to see its traceback):
/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
```
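The undefined symbol `_ZN3c104cuda9SetDeviceEi` demangles to `c10::cuda::SetDevice(int)`, a PyTorch C++ symbol, so this is not really a pip-level conflict: the installed flash-attn binary was built against a different PyTorch (or CUDA/ABI) version than the one in the environment. A quick way to see which versions a wheel has to match, using only standard torch and pip commands:

```bash
# Versions a flash-attn wheel must match: the torch version, the CUDA version
# torch was built with, and whether torch uses the C++11 ABI.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.compiled_with_cxx11_abi())"
# The flash-attn build currently installed (the one whose import fails above).
pip show flash-attn
```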

ZhangXJ199 commented 1 month ago

You can download a flash-attention build that matches your environment from https://github.com/Dao-AILab/flash-attention/releases, then install it with pip.
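For example, if the version check above reported torch 2.2 built with CUDA 12.2 on Python 3.10 without the C++11 ABI, the install might look like the sketch below. The wheel filename is purely illustrative; pick the release asset whose `torch`, `cu`, `cp`, and `cxx11abi` tags match your own environment:

```bash
pip uninstall -y flash-attn
# Illustrative filename only; substitute the matching asset from the releases page.
pip install flash_attn-2.5.8+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

Alternatively, `pip install flash-attn --no-build-isolation` compiles flash-attn from source against the torch already installed, which avoids matching a prebuilt wheel entirely at the cost of a long build.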