lss233 / chatgpt-mirai-qq-bot

🚀 One-click deployment! A real AI chatbot! Supports ChatGPT, 文心一言, 讯飞星火, Bing, Bard, ChatGLM, and POE, with multiple accounts, persona tuning, a virtual-maid mode, image rendering, and voice messages | Works on QQ, Telegram, Discord, WeChat, and other platforms
GNU Affero General Public License v3.0

[BUG] Stuck at "初始化处理中..." (initializing), then an error is thrown #136

Closed 501658362 closed 1 year ago

501658362 commented 1 year ago

Before submitting an issue, please confirm:

Symptoms
Describe how the bug manifests

Environment:

Steps to reproduce
Describe how you triggered the bug

  1. Wrote the config file; the bot started successfully
  2. @-mentioned the bot, and it immediately threw an error

Expected behavior
Describe what you expect to see under normal circumstances

Screenshots
Screenshots of relevant logs or chat records; skip if none

chatgpt_1  | 2023-02-15 06:51:08.402 | DEBUG    | chatbot:initial_process:110 - 初始化处理中...
chatgpt_1  | 2023-02-15 06:52:11.090 | ERROR    | __main__:handle_message:104 - ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
chatgpt_1  | Traceback (most recent call last):
chatgpt_1  |
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 703, in urlopen
chatgpt_1  |     httplib_response = self._make_request(
chatgpt_1  |                        │    └ <function HTTPConnectionPool._make_request at 0x7f7df89c7a60>
chatgpt_1  |                        └ <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f7df44c9cd0>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 449, in _make_request
chatgpt_1  |     six.raise_from(e, None)
chatgpt_1  |     │   └ <function raise_from at 0x7f7df8a53e50>
chatgpt_1  |     └ <module 'urllib3.packages.six' from '/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py'>
chatgpt_1  |   File "<string>", line 3, in raise_from
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 444, in _make_request
chatgpt_1  |     httplib_response = conn.getresponse()
chatgpt_1  |                        │    └ <function HTTPConnection.getresponse at 0x7f7df9c60820>
chatgpt_1  |                        └ <urllib3.connection.HTTPSConnection object at 0x7f7df44c9760>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 1377, in getresponse
chatgpt_1  |     response.begin()
chatgpt_1  |     │        └ <function HTTPResponse.begin at 0x7f7df9c5db80>
chatgpt_1  |     └ <http.client.HTTPResponse object at 0x7f7df44c9e20>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 320, in begin
chatgpt_1  |     version, status, reason = self._read_status()
chatgpt_1  |                               │    └ <function HTTPResponse._read_status at 0x7f7df9c5daf0>
chatgpt_1  |                               └ <http.client.HTTPResponse object at 0x7f7df44c9e20>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 289, in _read_status
chatgpt_1  |     raise RemoteDisconnected("Remote end closed connection without"
chatgpt_1  |           └ <class 'http.client.RemoteDisconnected'>
chatgpt_1  |
chatgpt_1  | http.client.RemoteDisconnected: Remote end closed connection without response
chatgpt_1  |
chatgpt_1  |
chatgpt_1  | During handling of the above exception, another exception occurred:
chatgpt_1  |
chatgpt_1  |
chatgpt_1  | Traceback (most recent call last):
chatgpt_1  |
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 489, in send
chatgpt_1  |     resp = conn.urlopen(
chatgpt_1  |            │    └ <function HTTPConnectionPool.urlopen at 0x7f7df89c7ca0>
chatgpt_1  |            └ <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f7df44c9cd0>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 787, in urlopen
chatgpt_1  |     retries = retries.increment(
chatgpt_1  |               │       └ <function Retry.increment at 0x7f7df8a02c10>
chatgpt_1  |               └ Retry(total=0, connect=None, read=False, redirect=None, status=None)
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 550, in increment
chatgpt_1  |     raise six.reraise(type(error), error, _stacktrace)
chatgpt_1  |           │   │            │       │      └ <traceback object at 0x7f7df4122980>
chatgpt_1  |           │   │            │       └ ProtocolError('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
chatgpt_1  |           │   │            └ ProtocolError('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
chatgpt_1  |           │   └ <function reraise at 0x7f7df8a53dc0>
chatgpt_1  |           └ <module 'urllib3.packages.six' from '/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py'>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 769, in reraise
chatgpt_1  |     raise value.with_traceback(tb)
chatgpt_1  |           │                    └ None
chatgpt_1  |           └ None
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 703, in urlopen
chatgpt_1  |     httplib_response = self._make_request(
chatgpt_1  |                        │    └ <function HTTPConnectionPool._make_request at 0x7f7df89c7a60>
chatgpt_1  |                        └ <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f7df44c9cd0>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 449, in _make_request
chatgpt_1  |     six.raise_from(e, None)
chatgpt_1  |     │   └ <function raise_from at 0x7f7df8a53e50>
chatgpt_1  |     └ <module 'urllib3.packages.six' from '/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py'>
chatgpt_1  |   File "<string>", line 3, in raise_from
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 444, in _make_request
chatgpt_1  |     httplib_response = conn.getresponse()
chatgpt_1  |                        │    └ <function HTTPConnection.getresponse at 0x7f7df9c60820>
chatgpt_1  |                        └ <urllib3.connection.HTTPSConnection object at 0x7f7df44c9760>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 1377, in getresponse
chatgpt_1  |     response.begin()
chatgpt_1  |     │        └ <function HTTPResponse.begin at 0x7f7df9c5db80>
chatgpt_1  |     └ <http.client.HTTPResponse object at 0x7f7df44c9e20>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 320, in begin
chatgpt_1  |     version, status, reason = self._read_status()
chatgpt_1  |                               │    └ <function HTTPResponse._read_status at 0x7f7df9c5daf0>
chatgpt_1  |                               └ <http.client.HTTPResponse object at 0x7f7df44c9e20>
chatgpt_1  |   File "/usr/local/lib/python3.9/http/client.py", line 289, in _read_status
chatgpt_1  |     raise RemoteDisconnected("Remote end closed connection without"
chatgpt_1  |           └ <class 'http.client.RemoteDisconnected'>
chatgpt_1  |
chatgpt_1  | urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
chatgpt_1  |
chatgpt_1  |
chatgpt_1  | During handling of the above exception, another exception occurred:
chatgpt_1  |
chatgpt_1  |
chatgpt_1  | Traceback (most recent call last):
chatgpt_1  |
chatgpt_1  |   File "/app/bot.py", line 152, in <module>
chatgpt_1  |     app.launch_blocking()
chatgpt_1  |     │   └ <classmethod object at 0x7f7df968aee0>
chatgpt_1  |     └ <graia.ariadne.app.Ariadne object at 0x7f7dfb3f56a0>
chatgpt_1  |
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/graia/ariadne/app.py", line 313, in launch_blocking
chatgpt_1  |     cls.launch_manager.launch_blocking(loop=cls.service.loop, stop_signal=stop_signals)
chatgpt_1  |     │   │              │                    │   │       │                 └ (<Signals.SIGINT: 2>,)
chatgpt_1  |     │   │              │                    │   │       └ <property object at 0x7f7df9692b80>
chatgpt_1  |     │   │              │                    │   └ <graia.ariadne.service.ElizabethService object at 0x7f7dfb3f5a30>
chatgpt_1  |     │   │              │                    └ <class 'graia.ariadne.app.Ariadne'>
chatgpt_1  |     │   │              └ <function Launart.launch_blocking at 0x7f7dfa818b80>
chatgpt_1  |     │   └ <launart.manager.Launart object at 0x7f7df8a743d0>
chatgpt_1  |     └ <class 'graia.ariadne.app.Ariadne'>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/launart/manager.py", line 494, in launch_blocking
chatgpt_1  |     loop.run_until_complete(launch_task)
chatgpt_1  |     │    │                  └ <Task pending name='amnesia-launch' coro=<Launart.launch() running at /usr/local/lib/python3.9/site-packages/launart/manager....
chatgpt_1  |     │    └ <function BaseEventLoop.run_until_complete at 0x7f7dfab4dc10>
chatgpt_1  |     └ <_UnixSelectorEventLoop running=True closed=False debug=False>
chatgpt_1  |   File "/usr/local/lib/python3.9/asyncio/base_events.py", line 634, in run_until_complete
chatgpt_1  |     self.run_forever()
chatgpt_1  |     │    └ <function BaseEventLoop.run_forever at 0x7f7dfab4db80>
chatgpt_1  |     └ <_UnixSelectorEventLoop running=True closed=False debug=False>
chatgpt_1  |   File "/usr/local/lib/python3.9/asyncio/base_events.py", line 601, in run_forever
chatgpt_1  |     self._run_once()
chatgpt_1  |     │    └ <function BaseEventLoop._run_once at 0x7f7dfab50700>
chatgpt_1  |     └ <_UnixSelectorEventLoop running=True closed=False debug=False>
chatgpt_1  |   File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1905, in _run_once
chatgpt_1  |     handle._run()
chatgpt_1  |     │      └ <function Handle._run at 0x7f7dfac714c0>
chatgpt_1  |     └ <Handle <TaskWakeupMethWrapper object at 0x7f7df44c9d60>(<Future finis...7f7df4124ba0>>)>
chatgpt_1  |   File "/usr/local/lib/python3.9/asyncio/events.py", line 80, in _run
chatgpt_1  |     self._context.run(self._callback, *self._args)
chatgpt_1  |     │    │            │    │           │    └ <member '_args' of 'Handle' objects>
chatgpt_1  |     │    │            │    │           └ <Handle <TaskWakeupMethWrapper object at 0x7f7df44c9d60>(<Future finis...7f7df4124ba0>>)>
chatgpt_1  |     │    │            │    └ <member '_callback' of 'Handle' objects>
chatgpt_1  |     │    │            └ <Handle <TaskWakeupMethWrapper object at 0x7f7df44c9d60>(<Future finis...7f7df4124ba0>>)>
chatgpt_1  |     │    └ <member '_context' of 'Handle' objects>
chatgpt_1  |     └ <Handle <TaskWakeupMethWrapper object at 0x7f7df44c9d60>(<Future finis...7f7df4124ba0>>)>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/graia/broadcast/__init__.py", line 191, in Executor
chatgpt_1  |     result = await run_always_await(target_callable, **parameter_compile_result)
chatgpt_1  |                    │                │                  └ {'group': Group(id=151059228, name='陈彦瑾、”吃瓜群众', account_perm=<普通成员>), 'source': Source(id=314, time=datetime.datetime(2023, 2...
chatgpt_1  |                    │                └ <function group_message_listener at 0x7f7df8a845e0>
chatgpt_1  |                    └ <function run_always_await at 0x7f7dfa7cdca0>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/graia/broadcast/utilles.py", line 34, in run_always_await
chatgpt_1  |     obj = await obj
chatgpt_1  |                 └ <coroutine object group_message_listener at 0x7f7df412a2c0>
chatgpt_1  |
chatgpt_1  |   File "/app/bot.py", line 124, in group_message_listener
chatgpt_1  |     response = await handle_message(group, f"group-{group.id}", chain.display, source)
chatgpt_1  |                      │              │                           │     │        └ Source(id=314, time=datetime.datetime(2023, 2, 15, 6, 51, 8, tzinfo=datetime.timezone.utc), type='Source')
chatgpt_1  |                      │              │                           │     └ <property object at 0x7f7df978cd60>
chatgpt_1  |                      │              │                           └ MessageChain([Plain(text='好')])
chatgpt_1  |                      │              └ Group(id=151059228, name='陈彦瑾、”吃瓜群众', account_perm=<普通成员>)
chatgpt_1  |                      └ <function handle_message at 0x7f7df8a84430>
chatgpt_1  |
chatgpt_1  | > File "/app/bot.py", line 97, in handle_message
chatgpt_1  |     resp = await session.get_chat_response(message)
chatgpt_1  |                  │       │                 └ '好'
chatgpt_1  |                  │       └ <function ChatSession.get_chat_response at 0x7f7df95c2310>
chatgpt_1  |                  └ <chatbot.ChatSession object at 0x7f7df44c98b0>
chatgpt_1  |
chatgpt_1  |   File "/app/chatbot.py", line 87, in get_chat_response
chatgpt_1  |     for item in resp:
chatgpt_1  |                 └ <generator object Chatbot.ask at 0x7f7df4124ba0>
chatgpt_1  |
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/revChatGPT/V1.py", line 145, in ask
chatgpt_1  |     response = self.session.post(
chatgpt_1  |                │    │       └ <function Session.post at 0x7f7df88e4790>
chatgpt_1  |                │    └ <requests.sessions.Session object at 0x7f7df8a26100>
chatgpt_1  |                └ <revChatGPT.V1.Chatbot object at 0x7f7df8a260d0>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 635, in post
chatgpt_1  |     return self.request("POST", url, data=data, json=json, **kwargs)
chatgpt_1  |            │    │               │         │          │       └ {'timeout': 360, 'stream': True}
chatgpt_1  |            │    │               │         │          └ None
chatgpt_1  |            │    │               │         └ '{"action": "next", "messages": [{"id": "2bf33c8c-7e8b-4e8f-b3aa-3301de3fd97a", "role": "user", "content": {"content_type": "...
chatgpt_1  |            │    │               └ 'https://chatgpt-proxy.fly.dev/api/conversation'
chatgpt_1  |            │    └ <function Session.request at 0x7f7df88e4550>
chatgpt_1  |            └ <requests.sessions.Session object at 0x7f7df8a26100>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 587, in request
chatgpt_1  |     resp = self.send(prep, **send_kwargs)
chatgpt_1  |            │    │    │       └ {'timeout': 360, 'allow_redirects': True, 'proxies': OrderedDict(), 'stream': True, 'verify': True, 'cert': None}
chatgpt_1  |            │    │    └ <PreparedRequest [POST]>
chatgpt_1  |            │    └ <function Session.send at 0x7f7df88e49d0>
chatgpt_1  |            └ <requests.sessions.Session object at 0x7f7df8a26100>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 701, in send
chatgpt_1  |     r = adapter.send(request, **kwargs)
chatgpt_1  |         │       │    │          └ {'timeout': 360, 'proxies': OrderedDict(), 'stream': True, 'verify': True, 'cert': None}
chatgpt_1  |         │       │    └ <PreparedRequest [POST]>
chatgpt_1  |         │       └ <function HTTPAdapter.send at 0x7f7df88e9e50>
chatgpt_1  |         └ <requests.adapters.HTTPAdapter object at 0x7f7df892ba90>
chatgpt_1  |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 547, in send
chatgpt_1  |     raise ConnectionError(err, request=request)
chatgpt_1  |           │                            └ <PreparedRequest [POST]>
chatgpt_1  |           └ <class 'requests.exceptions.ConnectionError'>
chatgpt_1  |
chatgpt_1  | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
mirai_1    | 2023-02-15 06:52:12 V/Bot.269920271: Group(151059228) <- 出现故障!如果这个问题持续出现,请和我说“重置会话” 来开启一段新的会话,或者发送 “回滚对话” 来回溯到上一条对话,你上一条说的我就当作没看见。\n('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
chatgpt_1  | 2023-02-15 06:52:12.382 | INFO     | graia.ariadne.model:log:83 - 269920271: [SEND][陈彦瑾、”吃瓜群众(151059228)] <- 出现故障!如果这个问题持续出现,请和我说“重置会话” 来开启一段新的会话,或者发送 “回滚对话” 来回溯到上一条对话,你上一条说的我就当作没看见。\n('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Other information
Fill in any other information here; skip if none

behinder85 commented 1 year ago

The error I see from upstream is requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')). I changed line 19 of V1.py under \chatgpt\python3.9\Lib\site-packages\revChatGPT to https://chatgpt-proxy2.fly.dev/ ; at the moment neither endpoint works.
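One way to check this observation is to probe both endpoints directly from the machine (or container) running the bot. The snippet below is only a diagnostic sketch: it uses the requests package the bot already depends on, the /api/conversation path is taken from the traceback above, and the second hostname is the one suggested in this comment.

import requests

ENDPOINTS = [
    "https://chatgpt-proxy.fly.dev/api/conversation",   # endpoint seen in the traceback
    "https://chatgpt-proxy2.fly.dev/api/conversation",  # alternative suggested above
]

for url in ENDPOINTS:
    try:
        # Any HTTP response (even 4xx) means the remote side is reachable;
        # a ConnectionError / RemoteDisconnected here reproduces the bot's failure.
        resp = requests.get(url, timeout=10)
        print(url, "->", resp.status_code)
    except requests.exceptions.RequestException as exc:
        print(url, "->", type(exc).__name__, exc)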

lss233 commented 1 year ago

Did you set a proxy for it?

501658362 commented 1 year ago

Did you set a proxy for it? Here is my config, please help me take a look:


[mirai]
qq = xxxx

# You can ignore the settings below if you are not sure what they do
api_key = "1234567890"
http_url = "http://mirai:8080"
ws_url = "http://mirai:8080"

[openai]
# OpenAI-related settings

# Mode selection: browser - log in via browser, proxy - log in via a third-party proxy
# If browser login gets stuck at "Found session token", use the third-party proxy login
# 172.16.100.230
# proxy = "http://172.16.100.230:9922"
mode = 'proxy'

# Your OpenAI email
email = "xxxx"

# Your OpenAI password
password = "xxxx"

# Whether to use a forward proxy (this is not the same thing as the third-party proxy
# login above, but it is recommended to set it when using the third-party proxy login)
# proxy="http://127.0.0.1:1080"

# Whether you are a ChatGPT Plus user (set to true if so)
paid = false

[system]
# Whether to automatically accept group invitations
accept_group_invite = false

# Whether to automatically accept friend requests
accept_friend_request = false

lss233 commented 1 year ago

# proxy = "http://172.16.100.230:9922"

This line is commented out, so the request is not going through a proxy, which is why you get this error.
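In other words, removing the leading # from that line (and keeping only one active proxy entry in the [openai] section) should send the bot's requests through the forward proxy again. A minimal sketch of the relevant fragment, assuming 172.16.100.230:9922 is an HTTP proxy reachable from inside the container:

[openai]
mode = 'proxy'
email = "xxxx"
password = "xxxx"
# forward proxy used when talking to the third-party conversation endpoint
proxy = "http://172.16.100.230:9922"
paid = false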

lss233 commented 1 year ago

v1.5.4.4 has been released; give it a try and see whether it solves your problem.