THUDM / ChatGLM3

ChatGLM3 series: Open Bilingual Chat LLMs | 开源双语对话语言模型
Apache License 2.0

chatglm3_6b model: stream_chat inference throws an error. The use case is generating Java code. With the same prompt, inference on chatglm3_6b_32k works normally. #261

Closed: weiwei0519 closed this issue 7 months ago

weiwei0519 commented 8 months ago

System Info / 系統信息

Traceback (most recent call last):
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 254, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/fastapi/applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/middleware/errors.py", line 149, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/middleware/cors.py", line 76, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/routing.py", line 341, in handle
    await self.app(scope, receive, send)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/starlette/routing.py", line 82, in app
    await func(session)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/fastapi/routing.py", line 289, in app
    await dependant.call(**values)
  File "/home/python/AI_project/gpt-coding/copilot_web_api.py", line 474, in assisstant_chat
    for generated_text, history in agent_tools.execute_task(task=task, history_task=history):
  File "/home/python/AI_project/gpt-coding/agent/tools_agent.py", line 59, in execute_task
    for response, history_task in self.llm.call_llm(prompt=task['prompt'],
  File "/home/python/AI_project/gpt-coding/model/llm_hub.py", line 189, in call_llm
    for response, history, past_key_values in self.model.stream_chat(tokenizer=self.tokenizer,
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 1077, in stream_chat
    response, new_history = self.process_response(response, history)
  File "/root/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 1003, in process_response
    metadata, content = response.split("\n", maxsplit=1)
ValueError: not enough values to unpack (expected 2, got 1)

ERROR: closing handshake failed
Traceback (most recent call last):
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/server.py", line 248, in handler
    await self.close()
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/protocol.py", line 766, in close
    await self.write_close_frame(Close(code, reason))
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/protocol.py", line 1232, in write_close_frame
    await self.write_frame(True, OP_CLOSE, data, _state=State.CLOSING)
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/protocol.py", line 1205, in write_frame
    await self.drain()
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/protocol.py", line 1194, in drain
    await self.ensure_open()
  File "/usr/local/python/anaconda3/envs/chatgpt_coding/lib/python3.9/site-packages/websockets/legacy/protocol.py", line 935, in ensure_open
    raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: sent 1000 (OK); no close frame received

Who can help? / 谁可以帮助到您?

No response

Information / 问题信息

Reproduction / 复现过程

The prompt was: "生成java代码,接入阿里云RocketMQ,实现队列推送,队列消费,队列任务监控等功能操作。" (Generate Java code that connects to Alibaba Cloud RocketMQ and implements queue publishing, queue consumption, and queue task monitoring.)
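A minimal sketch of how the failing call can be reproduced outside the reporter's service code, assuming chatglm3-6b is loaded via transformers with trust_remote_code (the prompt is the one quoted above; the loading code is an assumption, not taken from this report):

# Minimal reproduction sketch for the stream_chat error on chatglm3-6b.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True).half().cuda()
model = model.eval()

prompt = "生成java代码,接入阿里云RocketMQ,实现队列推送,队列消费,队列任务监控等功能操作。"
# stream_chat yields (response, history, past_key_values) incrementally;
# the ValueError above is raised from inside this generator.
for response, history, past_key_values in model.stream_chat(
        tokenizer, prompt, history=[], return_past_key_values=True):
    pass
print(response)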

Expected behavior / 期待表现

Please advise on a fix: is this a code issue or an environment issue?

zRzRzRzRzRzRzR commented 8 months ago

I can see agent-related code in your stack trace; chatglm3-6b does not support stream chat in agent mode. Also, does the error occur with the official code, or with an API request you wrote yourself?
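For others hitting this: given the maintainer's point above, a possible workaround is to fall back to the non-streaming chat() call, which returns the complete response in one shot and avoids the incremental parsing done by stream_chat. A sketch, assuming the same model, tokenizer, and prompt objects as in the reproduction above; whether the added latency is acceptable depends on the application:

# Non-streaming fallback: chat() returns the full response at once.
response, history = model.chat(tokenizer, prompt, history=[])
print(response)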