chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama.
Apache License 2.0

RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read) #4383

Closed · lxx1129z closed this issue 1 week ago

lxx1129z commented 3 weeks ago

Problem Description: chatchat starts without any problem, but asking a question raises the error: RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

Expected Result: a response is returned correctly.

Actual Result: (screenshot of the error attached)

Environment Information
  • langchain-ChatGLM version / commit: v3.0.0
  • Deployed with Docker (yes/no): no
  • Model used: local private model
  • Embedding model used:
  • OS and version: Windows
  • Python version:
  • Other relevant environment information:

Additional Information

```
cwd:C:\Users\dell
usage: chatchat [-h] [-a] [--all-api] [--api] [-w] [-i]
chatchat: error: unrecognized arguments: a

(llama-cpp-env) C:\Users\dell>chatchat -a
cwd:C:\Users\dell

==============================Langchain-Chatchat Configuration==============================
OS: Windows-10-10.0.19045-SP0.
Python version: 3.9.19 (main, May 6 2024, 20:12:36) [MSC v.1916 64 bit (AMD64)]
Project version: v0.3.0
langchain version: 0.1.17

Current text splitter: ChineseRecursiveTextSplitter
Default Embedding name: custom-embedding
==============================Langchain-Chatchat Configuration==============================

2024-07-03 13:18:01,239 root 76912 INFO Starting services:
2024-07-03 13:18:01,240 root 76912 INFO To view the llm_api logs, go to C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\chatchat\data\logs
C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
  warnings.warn(
C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\pydantic\_internal\_fields.py:151: UserWarning: Field "model_name" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ().
  warnings.warn(
INFO:     Started server process [60872]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:7861 (Press CTRL+C to quit)

You can now view your Streamlit app in your browser.

  URL: http://127.0.0.1:8501

INFO:     127.0.0.1:60893 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:11,185 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
  warnings.warn(
INFO:     127.0.0.1:60905 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:17,484 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:60906 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:17,716 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:60907 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:29,828 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:60908 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:35,201 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:60909 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:37,622 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:60910 - "POST /chat/chat/completions HTTP/1.1" 200 OK
2024-07-03 13:18:38,084 httpx 47928 INFO HTTP Request: POST http://127.0.0.1:7861/chat/chat/completions "HTTP/1.1 200 OK"
2024-07-03 13:18:38,102 httpx 60872 INFO HTTP Request: POST http://127.0.0.1:9997/v1/chat/completions "HTTP/1.1 200 OK"
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\sse_starlette\sse.py", line 269, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\sse_starlette\sse.py", line 258, in wrap
    await func()
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\sse_starlette\sse.py", line 215, in listen_for_disconnect
    message = await receive()
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 524, in receive
    await self.message_event.wait()
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\asyncio\locks.py", line 226, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 1f938b85790

During handling of the above exception, another exception occurred:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\Lib\site-packages\chatchat\webui.py", line 69, in <module>
    dialogue_page(api=api, is_lite=is_lite)
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\chatchat\webui_pages\dialogue\dialogue.py", line 361, in dialogue_page
    for d in client.chat.completions.create(
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\openai\_streaming.py", line 46, in __iter__
    for item in self._iterator:
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\openai\_streaming.py", line 58, in __stream__
    for sse in iterator:
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\openai\_streaming.py", line 50, in _iter_events
    yield from self._decoder.iter_bytes(self.response.iter_bytes())
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\openai\_streaming.py", line 280, in iter_bytes
    for chunk in self._iter_chunks(iterator):
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\openai\_streaming.py", line 291, in _iter_chunks
    for chunk in iterator:
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\httpx\_models.py", line 829, in iter_bytes
    for raw_bytes in self.iter_raw():
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\httpx\_models.py", line 883, in iter_raw
    for raw_stream_bytes in self.stream:
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\httpx\_client.py", line 126, in __iter__
    for chunk in self._stream:
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\httpx\_transports\default.py", line 114, in __iter__
    yield part
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\ProgramData\Anaconda3\envs\llama-cpp-env\lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

INFO:     127.0.0.1:60914 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:18:48,564 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:61078 - "GET /tools HTTP/1.1" 200 OK
2024-07-03 13:30:49,320 httpx 47928 INFO HTTP Request: GET http://127.0.0.1:7861/tools "HTTP/1.1 200 OK"
INFO:     127.0.0.1:61079 - "POST /chat/chat/completions HTTP/1.1" 200 OK
2024-07-03 13:30:49,777 httpx 47928 INFO HTTP Request: POST http://127.0.0.1:7861/chat/chat/completions "HTTP/1.1 200 OK"
2024-07-03 13:30:49,793 httpx 60872 INFO HTTP Request: POST http://127.0.0.1:9997/v1/chat/completions "HTTP/1.1 200 OK"
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  (same CancelledError traceback as above)
asyncio.exceptions.CancelledError: Cancelled by cancel scope 1f939d08790

During handling of the above exception, another exception occurred:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  (same frames as the first httpx.RemoteProtocolError traceback above)
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
```
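The httpx.RemoteProtocolError above is raised on the client side of the stream: the webui is reading the SSE response and the server side (ultimately the model backend at http://127.0.0.1:9997, per the logs) closes the connection before the chunked body is finished. A minimal probe such as the sketch below, which streams a completion directly from that backend with httpx, can help tell whether the backend itself aborts mid-stream or whether the problem is introduced by the chatchat API layer. The model name is a placeholder and the URL is simply the one that appears in the logs; adjust both for your deployment.

```python
# Minimal probe (not project code): stream a chat completion directly from the
# model backend seen in the logs and report whether it closes the stream early.
import httpx

payload = {
    "model": "qwen-14b-chat",  # placeholder -- use the model you actually deployed
    "stream": True,
    "messages": [{"role": "user", "content": "你好"}],
}

try:
    with httpx.stream(
        "POST",
        "http://127.0.0.1:9997/v1/chat/completions",  # Xinference endpoint from the logs
        json=payload,
        timeout=60,
    ) as resp:
        print("status:", resp.status_code)
        for line in resp.iter_lines():
            if line:
                print(line)
except httpx.RemoteProtocolError as exc:
    # Same exception as in the webui traceback: the server closed the
    # connection before the chunked response body was complete.
    print("backend closed the stream early:", exc)
```

If this probe also dies with RemoteProtocolError, the backend (Xinference / llama.cpp) logs are the place to look; if it completes cleanly, the problem is more likely in the chatchat API service in between.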

zx1159652666 commented 2 weeks ago

Hello, did you manage to solve this?

huanghx11 commented 2 weeks ago

I'd also like to ask whether this was solved in the end?

seallijian commented 2 weeks ago

I hit this error when calling Qwen-VL-Chat through Xinference. The file-upload feature in dialogue was originally commented out; I uncommented it and uploaded an image, and then this error appeared. With an ordinary language model such as Qwen-14B-Chat, text-only Q&A works fine, so I suspect the image data is not being passed to the API correctly.
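For reference, OpenAI-compatible servers generally expect the image to be embedded in the chat message itself rather than uploaded separately. The sketch below shows that generic format (a base64 data URL in an image_url content part); whether the Xinference deployment of Qwen-VL-Chat accepts exactly this shape is an assumption to verify against the Xinference documentation, and the base URL, API key, model name, and file name are all placeholders.

```python
# Generic OpenAI-compatible multimodal request (illustrative, not the
# project's own upload path): the image travels inside the message content
# as a base64 data URL.
import base64

from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:9997/v1", api_key="EMPTY")  # placeholder endpoint/key

with open("example.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = client.chat.completions.create(
    model="qwen-vl-chat",  # placeholder -- the VL model name you deployed
    stream=False,          # non-streaming surfaces server-side errors more clearly
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(resp.choices[0].message.content)
```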

nilin1998 commented 1 week ago

(quotes the original issue report in full; the quoted log additionally shows a server-side ExceptionGroup raised from chatchat\server\api_server\openai_routes.py and ending in "openai.APIError: An error occurred during streaming", plus a client-side httpcore.RemoteProtocolError with the same "incomplete chunked read" message)

Did you manage to solve this? I'm running into the same problem as you.

nilin1998 commented 1 week ago

(quotes seallijian's and huanghx11's comments above)

Did you manage to solve this?

ASan1527 commented 1 week ago

(quotes the original issue report in full again; same content as the quote above)

OP, did you manage to solve this?

liunux4odoo commented 1 week ago

Version 0.3.1 has been released. The configuration mechanism has been improved, and changing configuration options no longer requires restarting the server. Please update and try again.

seallijian commented 1 week ago

After upgrading to 0.3.1, ordinary text Q&A works fine, but image-to-text still fails. For example, with the InternVL-Chat model the problems are all concentrated in dialogue's client.chat.completions.create: whether streaming is set to true or false, it returns a 500 error. If image-to-text is indeed supported, I'd suggest the maintainers publish a detailed explanation of how to use it and what to watch out for.
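To see what is actually behind the 500, rather than the truncated stream the webui reports, it can help to call the chatchat endpoint directly with the OpenAI client and print the error payload. A sketch, assuming the API base URL inferred from the POST /chat/chat/completions paths in the logs above and a placeholder model name; swap in the multimodal message from the earlier sketch to reproduce the image case:

```python
# Sketch: call the chatchat OpenAI-compatible endpoint directly and print the
# body of any non-2xx response instead of letting it surface as a broken stream.
import openai
from openai import OpenAI

# Base URL inferred from the request paths in the logs; adjust for your deployment.
client = OpenAI(base_url="http://127.0.0.1:7861/chat", api_key="EMPTY")

try:
    resp = client.chat.completions.create(
        model="internvl-chat",  # placeholder -- the VL model you deployed
        stream=False,
        messages=[{"role": "user", "content": "你好"}],
    )
    print(resp.choices[0].message.content)
except openai.APIStatusError as exc:
    # A 500 from the server lands here; exc.response is the raw httpx response,
    # whose body usually contains the underlying error message.
    print("HTTP", exc.status_code)
    print(exc.response.text)
```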

Modas-Li commented 1 week ago

Solution: pip install 'transformers==4.41.2'
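If you try this fix, it is worth confirming that the pinned version is the one actually imported by the environment that launches chatchat; a quick check:

```python
# Run inside the same environment that starts chatchat.
import transformers

print(transformers.__version__)  # expect 4.41.2 after the reinstall
```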