Isotr0py / SakuraLLM-Notebooks

Notebooks to run SakuraLLM on colab/kaggle
49 stars 4 forks

File not found on Kaggle #1

Closed yunshaochu closed 8 months ago

yunshaochu commented 8 months ago

When I run this step on Kaggle, it fails with a file-not-found error.

Command:

```shell
!python server.py \
    --model_name_or_path $MODEL_PATH \
    --llama_cpp \
    --use_gpu \
    --model_version 0.8 \
    --trust_remote_code \
    --no-auth
```

Error:

```
Traceback (most recent call last):
  File "/kaggle/working/Sakura-13B-Galgame/server.py", line 100, in <module>
    state.init_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/state.py", line 24, in init_model
    sakura_model = SakuraModel(*args, **kwargs)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 131, in __init__
    (tokenizer, model) = load_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 64, in load_model
    from llama_cpp import Llama
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 87, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 76, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/opt/conda/lib/python3.10/site-packages/llama_cpp/libllama.so': libcudart.so.11.0: cannot open shared object file: No such file or directory
```

Also, in the "Output" panel on the right, the files under /kaggle/working/Sakura-13B-Galgame are only visible right after the repository is cloned; after a while they can no longer be seen.

How can I fix this?
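For context, the `RuntimeError` above is raised by llama-cpp-python when the dynamic loader cannot resolve the CUDA runtime (`libcudart.so.11.0`) that its compiled `libllama.so` depends on. A minimal sketch of the same lookup, using only the Python standard library (the library name mirrors the traceback; this is an illustration, not code from the repo):

```python
import ctypes.util


def cuda_runtime_location():
    """Return the loader-resolved name of the CUDA runtime library,
    or None when the dynamic loader cannot find it -- the situation
    shown in the traceback."""
    # find_library consults the system loader (the ldconfig cache on
    # Linux), the same search that fails when llama_cpp's libllama.so
    # asks for its libcudart.so dependency.
    return ctypes.util.find_library("cudart")


location = cuda_runtime_location()
if location is None:
    print("CUDA runtime not found -- consistent with the libcudart error")
else:
    print("CUDA runtime found:", location)
```

If this prints "not found", the failure is an environment problem (no CUDA libraries installed or visible), not a bug in server.py itself.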

yunshaochu commented 8 months ago

The full console output is:

```
address:https://0a35-35-237-187-103.ngrok-free.app/
[Errno 2] No such file or directory: '$ROOT_PATH/Sakura-13B-Galgame'
/kaggle/working/Sakura-13B-Galgame
2023-12-31 12:33:26 3128a61fc9fe __main__[147] WARNING Auth is disabled!
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_name" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_version" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_quant" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_name_or_path" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
2023-12-31 12:33:26 3128a61fc9fe __main__[147] INFO Current server config: Server(listen: 127.0.0.1:5000, auth: None:None)
2023-12-31 12:33:26 3128a61fc9fe __main__[147] INFO Current model config: SakuraModelConfig(model_name_or_path='./models/sakura-13b-lnovel-v0.8-Q4_K_M.gguf', use_gptq_model=False, trust_remote_code=True, text_length=512, llama=False, llama_cpp=True, use_gpu=True, n_gpu_layers=0, model_name=None, model_quant=None, model_version='0.8')
2023-12-31 12:33:30 3128a61fc9fe numexpr.utils[147] INFO NumExpr defaulting to 4 threads.
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 74, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)
  File "/opt/conda/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: libcudart.so.11.0: cannot open shared object file: No such file or directory
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "/kaggle/working/Sakura-13B-Galgame/server.py", line 100, in <module>
    state.init_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/state.py", line 24, in init_model
    sakura_model = SakuraModel(*args, **kwargs)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 131, in __init__
    (tokenizer, model) = load_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 64, in load_model
    from llama_cpp import Llama
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 87, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 76, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/opt/conda/lib/python3.10/site-packages/llama_cpp/libllama.so': libcudart.so.11.0: cannot open shared object file: No such file or directory
```

Isotr0py commented 8 months ago

It looks like there is no CUDA in the environment. Have you enabled the GPU accelerator for the Kaggle notebook?
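A quick way to confirm whether a session actually has a GPU attached is a standard-library check for the usual NVIDIA tooling (a heuristic sketch; the tool and driver-file names are standard NVIDIA conventions, not anything specific to this repo). On Kaggle the accelerator is typically enabled under the notebook's Settings → Accelerator menu:

```python
import os
import shutil


def gpu_runtime_visible() -> bool:
    """Heuristic: nvidia-smi is on PATH, or the kernel driver file
    exists, only when a GPU has actually been attached to the session."""
    return (shutil.which("nvidia-smi") is not None
            or os.path.exists("/proc/driver/nvidia/version"))


print("GPU runtime visible:", gpu_runtime_visible())
```

If this prints `False` inside the notebook, CUDA-dependent libraries such as the GPU build of llama-cpp-python will fail to load exactly as in the traceback above.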

yunshaochu commented 8 months ago

> It looks like there is no CUDA in the environment. Have you enabled the GPU accelerator for the Kaggle notebook?

I didn't realize it had to be enabled first. It works now, thanks!