chatchat-space / Langchain-Chatchat

Langchain-Chatchat(原Langchain-ChatGLM)基于 Langchain 与 ChatGLM, Qwen 与 Llama 等语言模型的 RAG 与 Agent 应用 | Langchain-Chatchat (formerly langchain-ChatGLM), local knowledge based LLM (like ChatGLM, Qwen and Llama) RAG and Agent app with langchain
Apache License 2.0

[BUG] baichuan online API fails to start #1638

Closed szdengdi closed 1 year ago

szdengdi commented 1 year ago

Problem Description
1. model_config.py contains a "baichuan-api" entry; after configuring it, startup reports "the provider for online model 'baichuan-api' is not configured correctly".
2. server_config.py has no port configuration for baichuan-api.

Steps to Reproduce
1. model_config.py contains a "baichuan-api" entry; after configuring it, startup logs "| ERROR | root | AttributeError: the provider for online model 'baichuan-api' is not configured correctly".
2. Commenting out "provider" still fails to start.
3. server_config.py has no port configuration for baichuan-api.

Expected Result
baichuan-api starts and runs.

Actual Result
Startup fails with an error and nothing is shown in the WebUI.

Environment Information

Additional Information

2023-10-02 10:33:07 | ERROR | root | AttributeError: the provider for online model 'baichuan-api' is not configured correctly
{'api_key': '65eab5fb 7640855c04685d2', 'device': 'cpu', 'host': '127.0.0.1', 'infer_turbo': False, 'model_path': None, 'online_api': True, 'port': 21007, 'provider': 'BaiChuanWorker', 'secret_key': 'iJ0 62yuGAc=', 'version': 'Baichuan2-53B'}
Current embeddings model: m3e-base @ cpu
==============================Langchain-Chatchat Configuration==============================

2023-10-02 10:33:07 | INFO | root | Starting services:
2023-10-02 10:33:07 | INFO | root | To view llm_api logs, see C:\tmp\Langchain-Chatchat\logs
2023-10-02 10:33:07 | ERROR | root | AttributeError: the provider for online model 'baichuan-api' is not configured correctly
2023-10-02 10:33:07 | ERROR | root | AttributeError: the provider for online model 'baichuan-api' is not configured correctly
2023-10-02 10:33:12 | WARNING | root | Sending SIGKILL to {'qianfan-api': , 'zhipu-api': , 'minimax-api': , 'qwen-api': , 'xinghuo-api': , 'fangzhou-api': }
Traceback (most recent call last):
  File "C:\tmp\Langchain-Chatchat\startup.py", line 705, in start_main_server
    controller_started.wait()  # wait for the controller to finish starting
  File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\managers.py", line 1093, in wait
    return self._callmethod('wait', (timeout,))
  File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\managers.py", line 818, in _callmethod
    kind, result = conn.recv()
  File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\connection.py", line 250, in recv
    buf = self._recv_bytes()
  File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\connection.py", line 305, in _recv_bytes
    waitres = _winapi.WaitForMultipleObjects(
  File "C:\tmp\Langchain-Chatchat\startup.py", line 573, in f
    raise KeyboardInterrupt(f"{signalname} received")
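The AttributeError above comes from how the worker class is looked up. A minimal sketch of the mechanism, with assumed names (`get_worker_class` and the namespace stand-in are illustrative, not the actual startup.py code): the `provider` string from the model config is resolved as an attribute of the `server.model_workers` package, so a worker class that `__init__.py` never imports cannot be found.

```python
import types

# Stand-in for the server.model_workers package; its __init__.py is
# missing the line: from .baichuan import BaiChuanWorker
model_workers = types.SimpleNamespace()

def get_worker_class(provider: str):
    # The provider string from model_config.py is resolved by attribute
    # lookup on the package; a missing import surfaces as AttributeError.
    try:
        return getattr(model_workers, provider)
    except AttributeError:
        raise AttributeError(
            f"the provider {provider!r} for an online model is not configured correctly"
        )

class BaiChuanWorker:
    """Dummy stand-in for server.model_workers.baichuan.BaiChuanWorker."""

# What the one-line import fix effectively does: make the class an
# attribute of the package so the lookup succeeds.
model_workers.BaiChuanWorker = BaiChuanWorker
assert get_worker_class("BaiChuanWorker") is BaiChuanWorker
```

This is why merely commenting out "provider" does not help: the lookup fails either way until the worker class is actually imported and exported by the package.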

szdengdi commented 1 year ago

1. In server\model_workers\__init__.py, add: from .baichuan import BaiChuanWorker
2. In configs\server_config.py, add:
   "baichuan-api": {
       "port": 21007,
   },
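The second edit can be sketched as follows. In the 0.2.x series the per-worker settings live in the FSCHAT_MODEL_WORKERS dict in configs/server_config.py; the "default" entry shown here is an illustrative placeholder, not the full file:

```python
# Sketch of the configs/server_config.py edit (0.2.x layout assumed).
FSCHAT_MODEL_WORKERS = {
    # Illustrative defaults applied to every worker unless overridden.
    "default": {
        "host": "127.0.0.1",
        "port": 20002,
    },
    # New entry so startup.py knows which port to bind the baichuan
    # online-API worker to:
    "baichuan-api": {
        "port": 21007,
    },
}
```

Without this entry, startup cannot allocate a port for the worker, which is the second symptom reported above.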

liunux4odoo commented 1 year ago

(Quoting @szdengdi's fix above.)

Thanks for the report. This is already fixed in the dev branch; for now you can apply @szdengdi's manual edits above.

liurr9810 commented 8 months ago

(Quotes the original issue report above.)

Environment Information

  • langchain-ChatGLM version / commit: 0.2.5
  • Deployed with Docker (yes/no): No
  • Model used (ChatGLM2-6B / Qwen-7B, etc.): baichuan-api
  • Operating system and version: Windows 11
  • Python version: 3.10.9


Hello, how did you obtain the secret key?
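For context on that question: both api_key and secret_key are issued in the Baichuan open-platform developer console, and in the 0.2.x BaiChuanWorker the secret_key is only used to MD5-sign each request body. The sketch below is hedged; the header names and exact signing scheme are recalled from the 0.2.x worker and should be verified against server/model_workers/baichuan.py:

```python
import hashlib
import json
import time

def calculate_md5(input_string: str) -> str:
    # Hex MD5 digest used to sign Baichuan API requests.
    return hashlib.md5(input_string.encode("utf-8")).hexdigest()

def build_headers(api_key: str, secret_key: str, body: dict) -> dict:
    # The signature covers secret_key + serialized request body + timestamp,
    # so the secret_key itself is never sent over the wire.
    json_data = json.dumps(body)
    time_stamp = int(time.time())
    signature = calculate_md5(secret_key + json_data + str(time_stamp))
    return {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,
        "X-BC-Timestamp": str(time_stamp),
        "X-BC-Signature": signature,
        "X-BC-Sign-Algo": "MD5",
    }
```

If the console only shows an api_key, the account may not have the Baichuan2-53B API enabled; the secret_key is issued alongside it for exactly this signing step.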