Isotr0py / SakuraLLM-Notebooks

Notebooks to run SakuraLLM on colab

Error when running Sakura-32B-Galgame-Kaggle-llama.cpp.ipynb #11

Closed — gmgorag closed this 2 months ago

gmgorag commented 2 months ago

When running the Sakura-32B-Galgame-Kaggle-llama.cpp.ipynb notebook on Kaggle with T4×2, the following exception is thrown right after "INFO loading model ..." appears:


  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/opt/conda/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/kaggle/working/Sakura-13B-Galgame/server.py", line 108, in <module>
    state.init_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/state.py", line 24, in init_model
    sakura_model = SakuraModel(*args, **kwargs)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 144, in __init__
    (tokenizer, model) = load_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 109, in load_model
    from infers.llama import LlamaCpp
  File "/kaggle/working/Sakura-13B-Galgame/infers/llama.py", line 5, in <module>
    from llama_cpp import Llama
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)
Isotr0py commented 2 months ago

This is probably because llama-cpp-python moved its CI build environment to ubuntu-latest:

The glibc on Kaggle's Ubuntu 20.04 image is a bit old; it only goes up to GLIBC_2.31.
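To confirm the mismatch, the runtime glibc version can be checked from Python itself; this is a minimal sketch using only the standard library's `platform.libc_ver()`, which reports the C library the interpreter is linked against:

```python
import platform

# On Kaggle's Ubuntu 20.04 image this reports something like "glibc 2.31".
libc, version = platform.libc_ver()
print(libc, version)

if version:
    # The prebuilt libllama.so requires GLIBC_2.32, so 2.31 is too old.
    major, minor = (int(x) for x in version.split(".")[:2])
    print("meets GLIBC_2.32:", (major, minor) >= (2, 32))
```

If the second line prints `False`, a wheel built against a newer glibc (such as one from ubuntu-latest CI runners) will fail to load exactly as in the traceback above.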

I'll build a wheel that works in the Kaggle environment later (it may take a few days; I have things going on next week and may not have time).

Isotr0py commented 2 months ago

Updated llama-cpp-python's index-url; it should run now.
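A quick way to verify that the reinstalled wheel actually loads is to trigger the import and catch the failure modes separately; this is a minimal sketch, where the `OSError` branch corresponds to the dlopen/GLIBC failure from the traceback above:

```python
def check_llama_cpp() -> bool:
    """Return True if llama_cpp's shared library loads, False otherwise."""
    try:
        import llama_cpp  # importing triggers the ctypes.CDLL load of libllama.so
        return True
    except ImportError:
        # package not installed at all
        return False
    except OSError:
        # shared library failed to load, e.g. "GLIBC_2.32 not found"
        return False

print(check_llama_cpp())
```

Running this in the Kaggle notebook after reinstalling from the updated index-url should print `True`; a `False` with the old wheel still cached suggests clearing pip's cache and reinstalling.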