lss233 / chatgpt-mirai-qq-bot

🚀 One-click deployment! A real AI chatbot! Supports ChatGPT, ERNIE Bot (文心一言), iFlytek Spark, Bing, Bard, ChatGLM, POE; multiple accounts, persona tuning, virtual maid, image rendering, voice messages | Works with QQ, Telegram, Discord, WeChat and other platforms
GNU Affero General Public License v3.0

[BUG] Please fill in a title #961

Open ccccchisato opened 1 year ago

ccccchisato commented 1 year ago

Before submitting this issue, please confirm:

- [x] I have read the FAQ, and this problem is not listed there
- [x] I have checked other issues, and they do not solve my problem
- [x] I believe this is not a bug in Mirai or OpenAI

Symptoms

When connected through the campus broadband network, the bot does not work properly; when connected through a mobile hotspot, it works fine.
AI platform status: most of the time my messages get no response; sometimes the AI receives them and generates a reply, but the bot fails to deliver it to me.

Runtime environment:

- OS: Windows 10
- Docker:
- Project version: 2.5.1

Steps to reproduce (how the bug was triggered)

The computer is set to shut down on a schedule, and the bot was running (on a hotspot connection) before the shutdown. After one of these shutdowns, an error appeared in 启动ChatGPT.cmd. Over the following days, on the broadband connection the bot usually failed to work: sometimes it worked at first and then stopped, sometimes it failed at first and later started working, and only occasionally did it work the whole time. On a mobile hotspot connection it always worked normally. I tried reinstalling project version 2.5.1, which did not solve the problem.

Expected behavior

Under normal conditions there should be no large error dumps in the log; the bot should run and respond normally.

Below is the log from one of these occurrences in 启动ChatGPT.cmd:

2023-06-15 18:55:56.034 | ERROR | universal:handle_message:295 - Traceback (most recent call last):
  File "asyncio\windows_events.py", line 494, in finish_recv
    def connect(self, conn, address):
OSError: [WinError 64] The specified network name is no longer available.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "asyncio\proactor_events.py", line 286, in _loop_reading
    self._write_fut.add_done_callback(self._loop_writing)
  File "asyncio\windows_events.py", line 846, in _poll
  File "asyncio\windows_events.py", line 498, in finish_recv
    _overlapped.BindLocal(conn.fileno(), conn.family)
ConnectionResetError: [WinError 64] The specified network name is no longer available.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_exceptions.py", line 10, in map_exceptions
    yield
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\backends\asyncio.py", line 78, in start_tls
    raise exc
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\backends\asyncio.py", line 69, in start_tls
    ssl_stream = await anyio.streams.tls.TLSStream.wrap(
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\anyio\streams\tls.py", line 122, in wrap
    await wrapper._call_sslobject_method(ssl_object.do_handshake)
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\anyio\streams\tls.py", line 137, in _call_sslobject_method
    data = await self.transport_stream.receive()
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\anyio\_backends\_asyncio.py", line 1274, in receive
    raise self._protocol.exception
anyio.BrokenResourceError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions
    yield
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_transports\default.py", line 353, in handle_async_request
    resp = await self._pool.handle_async_request(req)
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_async\connection_pool.py", line 253, in handle_async_request
    raise exc
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_async\connection_pool.py", line 237, in handle_async_request
    response = await connection.handle_async_request(request)
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_async\connection.py", line 86, in handle_async_request
    raise exc
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_async\connection.py", line 63, in handle_async_request
    stream = await self._connect(request)
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_async\connection.py", line 150, in _connect
    stream = await stream.start_tls(**kwargs)
        └ kwargs = {'ssl_context': <ssl.SSLContext object at 0x0000017441D85130>, 'server_hostname': 'chatgpt-proxy.lss233.com', 'timeout': 60}
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\backends\asyncio.py", line 66, in start_tls
    with map_exceptions(exc_map):
  File "contextlib.py", line 155, in __exit__
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc)
httpcore.ConnectError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\桌面\Q-chat-bot\chatgpt\bot.py", line 54, in <module>
    loop.run_until_complete(asyncio.gather(*bots))
  File "asyncio\base_events.py", line 640, in run_until_complete
    sock.setblocking(False)
  File "asyncio\windows_events.py", line 321, in run_forever
  File "asyncio\base_events.py", line 607, in run_forever
    sock.close()
  File "asyncio\base_events.py", line 1922, in _run_once
  File "asyncio\events.py", line 80, in _run
    self._loop = loop
  File "D:\桌面\Q-chat-bot\chatgpt\platforms\onebot_bot.py", line 151, in
    await handle_message(
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 269, in handle_message
    await action(session_id, message.strip(), conversation_context, respond)
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "D:\桌面\Q-chat-bot\chatgpt\middlewares\concurrentlock.py", line 43, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "D:\桌面\Q-chat-bot\chatgpt\middlewares\middleware.py", line 9, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "D:\桌面\Q-chat-bot\chatgpt\middlewares\ratelimit.py", line 23, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "D:\桌面\Q-chat-bot\chatgpt\middlewares\timeout.py", line 27, in handle_request
    await asyncio.wait_for(coro_task, config.response.max_timeout)
  File "asyncio\tasks.py", line 479, in wait_for
    for f in todo:
  File "D:\桌面\Q-chat-bot\chatgpt\universal.py", line 222, in request
    async for rendered in task:
  File "D:\桌面\Q-chat-bot\chatgpt\conversation.py", line 228, in load_preset
    async for item in self.adapter.preset_ask(role=role.lower().strip(), text=text.strip()):
  File "D:\桌面\Q-chat-bot\chatgpt\adapter\claude\slack.py", line 108, in preset_ask
    async for item in self.ask(text): ...
  File "D:\桌面\Q-chat-bot\chatgpt\adapter\claude\slack.py", line 77, in ask
    async with self.client.stream(
  File "contextlib.py", line 204, in __aenter__
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_client.py", line 1576, in stream
    response = await self.send(
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_client.py", line 1620, in send
    response = await self._send_handling_auth(
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_client.py", line 1648, in _send_handling_auth
    response = await self._send_handling_redirects(
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_client.py", line 1685, in _send_handling_redirects
    response = await self._send_single_request(request)
        └ request = <Request('POST', 'https://chatgpt-proxy.lss233.com/claude-in-slack/backend-api/conversation')>
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_client.py", line 1722, in _send_single_request
    response = await transport.handle_async_request(request)
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_transports\default.py", line 352, in handle_async_request
    with map_httpcore_exceptions():
  File "contextlib.py", line 155, in __exit__
  File "D:\桌面\Q-chat-bot\python3.11\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError
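For reference, the chain above bottoms out in the TLS handshake to chatgpt-proxy.lss233.com being reset (WinError 64), which matches the campus-network-only failures. Below is a minimal standalone probe (not part of the project; it only assumes httpx, which the bot already depends on): run it once on the campus network and once on the hotspot to see whether the handshake itself is being cut. It also prints the elapsed time, which is relevant to the response delay noted further down.

```python
# Minimal connectivity probe for the proxy host seen in the traceback.
# Standalone sketch, not project code; adjust PROBE_URL if your config
# points at a different endpoint.
import asyncio
import time

import httpx

PROBE_URL = "https://chatgpt-proxy.lss233.com/"  # host taken from the log above


async def probe() -> None:
    start = time.monotonic()
    try:
        async with httpx.AsyncClient(timeout=15) as client:
            resp = await client.get(PROBE_URL)
        print(f"OK: HTTP {resp.status_code} in {time.monotonic() - start:.1f}s")
    except httpx.ConnectError as exc:
        # Same failure class as in the log: the TCP/TLS connection was reset.
        print(f"ConnectError after {time.monotonic() - start:.1f}s: {exc!r}")
    except httpx.TimeoutException as exc:
        print(f"Timed out after {time.monotonic() - start:.1f}s: {exc!r}")


if __name__ == "__main__":
    asyncio.run(probe())
```

If the probe only fails on the campus network, the problem is the network path to the proxy host rather than the bot itself.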

Additional notes

Also, the bot sometimes responds somewhat slowly, and I am not sure whether that indicates a problem. On a hotspot connection, after sending "你好" it takes roughly 12 seconds to get the AI's reply.


lss233 commented 1 year ago

The network connection timed out; I suggest using an overseas server.

ccccchisato commented 1 year ago

> The network connection timed out; I suggest using an overseas server.

Is that the only option? Is there any other way to solve it?

lss233 commented 1 year ago

> The network connection timed out; I suggest using an overseas server.
>
> Is that the only option? Is there any other way to solve it?

Or switch to a proxy with lower latency. This generally does not happen on an overseas server; running inside mainland China through a proxy, fluctuations in the proxy connection can cause this problem.
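For anyone trying the proxy route, the sketch below only illustrates what "go through a lower-latency proxy" means at the httpx layer the bot uses under the hood. PROXY_URL is a hypothetical placeholder, not a value from this project's config; the bot has its own proxy options in its config file, so check the project README for the actual keys instead of copying this. Note that httpx 0.25 and earlier take a `proxies=` argument, while newer releases renamed it to `proxy=`.

```python
# Illustration only, not this project's configuration: routing an httpx
# request through a local forward proxy to compare latency with a direct
# connection. PROXY_URL is a hypothetical placeholder.
import asyncio

import httpx

PROXY_URL = "http://127.0.0.1:7890"  # hypothetical local proxy endpoint
TARGET = "https://chatgpt-proxy.lss233.com/"  # host from the traceback above


async def main() -> None:
    # httpx 0.25 and earlier accept `proxies=`; newer versions use `proxy=`.
    async with httpx.AsyncClient(proxies=PROXY_URL, timeout=15) as client:
        resp = await client.get(TARGET)
        print(f"HTTP {resp.status_code} via proxy {PROXY_URL}")


asyncio.run(main())
```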