chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama
Apache License 2.0

[BUG] Initializing the database fails, and chatting raises RemoteProtocolError #4472

Closed misaka100001 closed 4 months ago

misaka100001 commented 4 months ago

Problem description

When initializing the knowledge base with chatchat-kb -r, the first error is:

urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='127.0.0.1', port=9997): Max retries exceeded with url: /v1/cluster/auth (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001A922823990>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))

followed by:

raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 samples 加载失败。

(i.e., the vector store "samples" failed to load). Then, after starting the app with chatchat -a, chatting with the AI fails with: RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

Steps to reproduce

  1. Started Xinference via xinference-local --host 127.0.0.1 --port 9997, with the Language Model set to chatglm3 and the Embedding Model set to bge-large-zh-v1.5
  2. Configured the model: chatchat-config model --default_llm_model chatglm3
  3. In C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\configs\_model_config.py, made the following change: self.MODEL_PLATFORMS = [ { ……

    I don't use openai, so I left that section unchanged

        },
        {
            "platform_name": "xinference",
            "platform_type": "xinference",
            "api_base_url": "http://127.0.0.1:9997/v1",
            "api_key": "EMPT",
            "api_concurrencies": 5,
            "llm_models": [
                "chatglm3",
            ],
            "embed_models": [
                "bge-large-zh-v1.5",
            ],
            "image_models": [],
            "reranking_models": [],
            "speech2text_models": [],
            "tts_models": [],
        },
    ]

    Then, in C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\utils.py, made the following change:

        def get_Embeddings(
            ……
        ) -> Embeddings:
            from langchain_community.embeddings import OllamaEmbeddings, XinferenceEmbeddings
            from langchain_openai import OpenAIEmbeddings
            from chatchat.server.localai_embeddings import (
                LocalAIEmbeddings,
            )
            return XinferenceEmbeddings(
                server_url="http://127.0.0.1:9997", model_uid="my-bge-large-zh"
            )
            model_info = get_model_info(model_name=embed_model)
            params = dict(model=embed_model)
            ……
        )

  4. Ran chatchat-kb -r; it failed with "RuntimeError: 向量库 samples 加载失败。"
  5. Ran chatchat -a; chatting with the AI failed with "RemoteProtocolError"
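Both failures trace back to a refused TCP connection on 127.0.0.1:9997, which means nothing was accepting connections at the configured address when chatchat ran. A quick standalone way to verify this before step 4 (the helper name is mine; it is not part of chatchat or Xinference):

```python
import socket

def port_is_open(host: str = "127.0.0.1", port: int = 9997, timeout: float = 2.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # [WinError 10061] / ECONNREFUSED lands here: nothing is listening
        return False

if __name__ == "__main__":
    if port_is_open():
        print("Xinference port is reachable")
    else:
        print("Connection refused: start xinference-local first")
```

If this prints "Connection refused", the later chatchat errors are guaranteed, so it is worth re-checking that xinference-local is still running before debugging chatchat itself.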

Expected result
Running chatchat-kb -r should initialize the knowledge base normally.
Running chatchat -a should allow a normal conversation with the AI.

Actual result

(1) Running chatchat-kb -r produced the following error:

Traceback (most recent call last):
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 61, in new_vector_store
    embeddings = get_Embeddings(embed_model=embed_model)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\utils.py", line 234, in get_Embeddings
    return XinferenceEmbeddings(
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\langchain_community\embeddings\xinference.py", line 93, in __init__
    self.client = RESTfulClient(server_url)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\xinference\client\restful\restful_client.py", line 738, in __init__
    self._check_cluster_authenticated()
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\xinference\client\restful\restful_client.py", line 756, in _check_cluster_authenticated
    response = requests.get(url)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\requests\api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\requests\api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\requests\adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=9997): Max retries exceeded with url: /v1/cluster/auth (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001A922823990>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\init_database.py", line 129, in main
    folder2db(
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\migrate.py", line 157, in folder2db
    kb.create_kb()
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_service\base.py", line 102, in create_kb
    self.do_create_kb()
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 57, in do_create_kb
    self.load_vector_store()
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 32, in load_vector_store
    return kb_faiss_pool.load_vector_store(
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 141, in load_vector_store
    raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 samples 加载失败。

(2) Running chatchat -a and then chatting with the AI produced the following error:

RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

Traceback:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\webui.py", line 69, in <module>
    dialogue_page(api=api, is_lite=is_lite)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\chatchat\webui_pages\dialogue\dialogue.py", line 361, in dialogue_page
    for d in client.chat.completions.create(
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\openai\_streaming.py", line 46, in __iter__
    for item in self._iterator:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\openai\_streaming.py", line 58, in __stream__
    for sse in iterator:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\openai\_streaming.py", line 50, in _iter_events
    yield from self._decoder.iter_bytes(self.response.iter_bytes())
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\openai\_streaming.py", line 280, in iter_bytes
    for chunk in self._iter_chunks(iterator):
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\openai\_streaming.py", line 291, in _iter_chunks
    for chunk in iterator:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\httpx\_models.py", line 829, in iter_bytes
    for raw_bytes in self.iter_raw():
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\httpx\_models.py", line 883, in iter_raw
    for raw_stream_bytes in self.stream:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\httpx\_client.py", line 126, in __iter__
    for chunk in self._stream:
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\httpx\_transports\default.py", line 112, in __iter__
    with map_httpcore_exceptions():
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\MISAKA\Anaconda3\envs\pytorch\Lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
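A RemoteProtocolError like the one above is usually a downstream symptom: the API server starts a chunked streaming response, cannot get data from the model backend, and drops the connection before the body completes. A small triage helper, purely my own sketch (the messages and the name-based matching are assumptions, not chatchat code):

```python
def classify_stream_error(exc: Exception) -> str:
    """Rough triage of errors seen while consuming a streaming chat response."""
    name = type(exc).__name__
    # httpx.RemoteProtocolError: the peer closed mid-stream, so the backend
    # most likely died, or the API server could never reach it.
    if name == "RemoteProtocolError":
        return "server closed the connection mid-stream; check the model backend"
    # httpx.ConnectError / builtin ConnectionError: nothing listening at all.
    if name in ("ConnectError", "ConnectionError"):
        return "could not connect; verify the host/port and that the service is running"
    return "unexpected error"
```

Matching on the exception class name keeps the sketch free of third-party imports; in real code you would catch httpx's exception types directly.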

Environment information

  • langchain-ChatGLM version/commit: unknown (installed directly with pip install langchain-chatchat -U)
  • Docker deployment (yes/no): no
  • Model (ChatGLM2-6B / Qwen-7B, etc.): chatglm3
  • Embedding model (moka-ai/m3e-base, etc.): bge-large-zh-v1.5
  • Vector store type (faiss / milvus / pg_vector, etc.): none used
  • Operating system and version: Windows 11
  • Python version: 3.11
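Since the package was installed with pip, its exact version can be read from the installed package metadata rather than left as "unknown" (a small stdlib sketch; the helper name is mine):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name: str) -> str:
    """Look up an installed distribution's version, e.g. for bug reports."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "not installed"

print(installed_version("langchain-chatchat"))
```

The argument is the PyPI distribution name (langchain-chatchat), not the import name.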

ASan1527 commented 4 months ago


Bro, did you solve it?

misaka100001 commented 4 months ago

> Bro, did you solve it?

Not yet. I'm still hoping some other expert will answer.

ASan1527 commented 4 months ago

> Bro, did you solve it?
>
> Not yet. I'm still hoping some other expert will answer.

Bro, can you chat at all when you don't use the knowledge base?

misaka100001 commented 4 months ago

> Bro, can you chat at all when you don't use the knowledge base?

It errors out right away, and you're asking me whether I can chat?

liunux4odoo commented 4 months ago

Version 0.3.1 has been released. It improves the configuration mechanism, so changing config options no longer requires restarting the server. Please update and try again.