Closed LiYuanHaooo closed 8 months ago
🤖
Hello, @LiYuanHaooo! Nice to see you here again. Hope you're doing well!
Based on the error log you provided, the problem is likely that the program is trying to access a file or directory that does not exist. In the context of starting Langchain-Chatchat v0.2.10 in Docker, this could have several causes:
1. The model_path specified in the run_model_worker function does not exist. This path is used to load the language model for the application. If the path is wrong, or the model files are not at the specified location, this error can occur.
2. The startup.sh script (not provided in the context) may be trying to access a file or directory that does not exist inside the Docker container. This could be due to differences in file layout between the local system and the container.
3. The Docker container may lack the permissions needed to access certain directories or files.
Without the exact error message or the startup.sh script, it is hard to determine the precise cause. However, you can try the following steps to debug the problem:
1. Check the model_path in the run_model_worker function and make sure the model files actually exist at that location.
2. Review the startup.sh script and make sure all file and directory paths in it are correct.
If you can provide more information about the startup.sh script, I may be able to help you further.
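To act on point 1 quickly, here is a minimal sketch (the function and its error message are illustrative, not part of the project) that fails fast with a readable message when the mounted model directory is missing, instead of an opaque FileNotFoundError later during startup:

```python
import os

def check_model_path(model_path):
    # Fail fast before the worker starts loading; a missing directory
    # usually means the docker volume mount is wrong. (This helper is
    # a suggestion, not an existing Langchain-Chatchat function.)
    if not os.path.isdir(model_path):
        raise FileNotFoundError(
            f"model_path does not exist: {model_path!r}; "
            "check the volume mount in your docker run command"
        )
    return model_path
```

For the log above, `model_path_exists: True` shows this check would pass, which already rules out cause 1.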
I have this problem too!
The api started successfully, which means the model was found, and the log never says which file is missing, so the problem is hard to pinpoint.
Full log below:
==========
== CUDA ==
==========
CUDA Version 12.1.1
Container image Copyright (c) 2016-2023, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license
A copy of this license is made available in this container at /NGC-DL-CONTAINER-LICENSE for your convenience.
2024-01-26 07:24:52,093 - startup.py[line:655] - INFO: 正在启动服务:
2024-01-26 07:24:52,093 - startup.py[line:656] - INFO: 如需查看 llm_api 日志,请前往 /logs
==============================Langchain-Chatchat Configuration==============================
操作系统:Linux-6.1.0-13-amd64-x86_64-with-glibc2.35.
python版本:3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]
项目版本:v0.2.10
langchain版本:0.0.354. fastchat版本:0.2.35
当前使用的分词器:ChineseRecursiveTextSplitter
当前启动的LLM模型:['chatglm3-6b-32k'] @ cuda
{'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_path': '/model/chatglm3-6b-32k',
'model_path_exists': True,
'port': 20002}
当前Embbedings模型: bge-large-zh-v1.5 @ cuda
==============================Langchain-Chatchat Configuration==============================
/usr/local/lib/python3.10/dist-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: 模型启动功能将于 Langchain-Chatchat 0.3.x重写,支持更多模式和加速启动,0.2.x中相关功能将废弃
warn_deprecated(
2024-01-26 07:24:54 | ERROR | stderr | INFO: Started server process [144]
2024-01-26 07:24:54 | ERROR | stderr | INFO: Waiting for application startup.
2024-01-26 07:24:54 | ERROR | stderr | INFO: Application startup complete.
2024-01-26 07:24:54 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:20000 (Press CTRL+C to quit)
2024-01-26 07:24:54 | INFO | model_worker | Loading the model ['chatglm3-6b-32k'] on worker 6d9e8ddd ...
shards: 100%|██████████| 7/7 [00:02<00:00, 2.35it/s]
2024-01-26 07:24:57 | ERROR | stderr |
2024-01-26 07:24:59 | INFO | model_worker | Register to controller
INFO: Started server process [232]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:7861 (Press CTRL+C to quit)
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
Traceback (most recent call last):
File "/usr/local/bin/streamlit", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/streamlit/web/cli.py", line 233, in main_run
_main_run(target, args, flag_options=kwargs)
File "/usr/local/lib/python3.10/dist-packages/streamlit/web/cli.py", line 269, in _main_run
bootstrap.run(file, is_hello, args, flag_options)
File "/usr/local/lib/python3.10/dist-packages/streamlit/web/bootstrap.py", line 411, in run
_install_pages_watcher(main_script_path)
File "/usr/local/lib/python3.10/dist-packages/streamlit/web/bootstrap.py", line 386, in _install_pages_watcher
watch_dir(
File "/usr/local/lib/python3.10/dist-packages/streamlit/watcher/path_watcher.py", line 153, in watch_dir
return _watch_path(
File "/usr/local/lib/python3.10/dist-packages/streamlit/watcher/path_watcher.py", line 128, in _watch_path
watcher_class(
File "/usr/local/lib/python3.10/dist-packages/streamlit/watcher/event_based_path_watcher.py", line 92, in __init__
path_watcher.watch_path(
File "/usr/local/lib/python3.10/dist-packages/streamlit/watcher/event_based_path_watcher.py", line 170, in watch_path
folder_handler.watch = self._observer.schedule(
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/api.py", line 301, in schedule
emitter.start()
File "/usr/local/lib/python3.10/dist-packages/watchdog/utils/__init__.py", line 92, in start
self.on_thread_start()
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify.py", line 119, in on_thread_start
self._inotify = InotifyBuffer(path, self.watch.is_recursive)
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify_buffer.py", line 37, in __init__
self._inotify = Inotify(path, recursive)
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify_c.py", line 179, in __init__
self._add_dir_watch(path, recursive, event_mask)
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify_c.py", line 402, in _add_dir_watch
self._add_watch(full_path, mask)
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify_c.py", line 416, in _add_watch
Inotify._raise_error()
File "/usr/local/lib/python3.10/dist-packages/watchdog/observers/inotify_c.py", line 432, in _raise_error
raise OSError(err, os.strerror(err))
FileNotFoundError: [Errno 2] No such file or directory
Maybe try downgrading the streamlit version a bit?
I pinned the streamlit-related versions in requirements.txt to match 0.2.9 (0.2.9 starts fine for me with no errors), but I still get the same error:
streamlit~=1.29.0
streamlit-option-menu>=0.3.6
streamlit-chatbox==1.1.11
streamlit-modal>=0.1.0
streamlit-aggrid>=0.3.4.post3
watchdog>=3.0.0
This problem is solved. I don't know the root cause, only that streamlit triggers it: adding a WORKDIR /xxx
to the Dockerfile makes it work, while running from the /
root directory reproduces the error.
Reference: https://discuss.streamlit.io/t/watchdog-error-when-running-streamlit-in-docker/26865/2
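The WORKDIR fix matches the linked streamlit discussion: the traceback above ends in watchdog's inotify code, which suggests the file watcher tries to watch a directory that does not exist when the process starts in the container's / root. A defensive sketch (the function and the /app fallback are illustrative assumptions, not part of the project):

```python
import os

def ensure_workdir(fallback="/app"):
    # If the process was started from "/", switch to a real working
    # directory before launching streamlit, so its file watcher does
    # not register inotify watches on paths that do not exist.
    # (fallback is an assumed path; in a Dockerfile the equivalent
    # is simply a WORKDIR instruction.)
    if os.getcwd() == "/":
        os.makedirs(fallback, exist_ok=True)
        os.chdir(fallback)
    return os.getcwd()
```

In practice the one-line `WORKDIR /xxx` in the Dockerfile is the cleaner fix; this guard is only useful if you cannot change the image.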
I hit this problem with a normal local install too; is there a solution yet?
I hit this problem with a normal local install too; is there a solution yet?
Not using docker.
@Macbook-Specter Type streamlit in the terminal and see whether it prints anything.
Usage: streamlit [OPTIONS] COMMAND [ARGS]...
Try out a demo with:
$ streamlit hello
Or use the line below to run your own script:
$ streamlit run your_script.py
Options:
--log_level [error|warning|info|debug]
--version Show the version and exit.
--help Show this message and exit.
Commands:
activate Activate Streamlit by entering your email.
cache Manage the Streamlit cache.
config Manage Streamlit's config settings.
docs Show help in browser.
hello Runs the Hello World script.
help Print this help message.
run Run a Python script, piping stderr to Streamlit.
version Print Streamlit's version number.
Configure the streamlit environment variable (add it to PATH).
@tx-anin Thanks a lot, I've solved it.
@tx-anin Thanks a lot, I've solved it.
How did you solve it? I get this error when debugging, but it runs fine from the command line.
@tx-anin Thanks a lot, I've solved it.
How did you solve it? I get this error when debugging, but it runs fine from the command line.
Same here: debugging errors out, a normal run works fine!
@Macbook-Specter Type streamlit in the terminal and see whether it prints anything.
Configure the streamlit environment variable (add it to PATH).
The error above is raised because the streamlit executable cannot be found when the streamlit command is used to launch the app.
Solution:
- Configure the environment variable so streamlit is on PATH;
- Or find where streamlit is invoked, e.g. the place in startup.py that builds cmd, and use the full path to the streamlit executable directly:
cmd = ["your_streamlit_path", "run", "webui.py",
       "--server.address", host, "--server.port", str(port),
       "--theme.base", "light", "--theme.primaryColor", "#165dff",
       "--theme.secondaryBackgroundColor", "#f5f5f5",
       "--theme.textColor", "#000000"]
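Instead of hard-coding "your_streamlit_path", the full path can be resolved automatically. A hedged sketch (the helper and its fallback location are illustrative; the fallback matches the docker image's /usr/local/bin in the traceback above, but verify it on your system):

```python
import shutil

def build_streamlit_cmd(script="webui.py", host="0.0.0.0", port=8501,
                        fallback="/usr/local/bin/streamlit"):
    # shutil.which returns the absolute path of the executable if it
    # is on PATH; otherwise fall back to an explicit location, so the
    # subprocess no longer depends on PATH being configured.
    exe = shutil.which("streamlit") or fallback
    return [exe, "run", script,
            "--server.address", host,
            "--server.port", str(port)]
```

The returned list can be passed straight to subprocess.Popen, as startup.py does with its own cmd.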
That's exactly it; this solved the problem.
Problem Description
In docker, after running
python startup.py -a
the startup log reports an error (the api starts successfully, but streamlit fails). Environment Information