GavinGoo / pandora-web

Revives the original Pandora as a web panel that can use an OAI front-end proxy plus custom APIs
GNU General Public License v2.0

Error when using the API in Old Chat mode #7

Closed: PlayMcBKuwu closed this issue 3 months ago

PlayMcBKuwu commented 3 months ago
(base) C:\Users\PlayMcBKuwu\Desktop\pandora-web-dev>pandora -s -l --site_password qwq123 --old_chat -v

            Pandora - A command-line interface to ChatGPT
            Original Github: https://github.com/zhile-io/pandora
            Original author: https://github.com/wozulong
            Secondary dev: https://github.com/GavinGoo/pandora-web/tree/dev
            Get access token: None
            Version: 1.3.2, Mode: server, Engine: free
            Support OAI: False

Your Arguments:
email: None
password: None
mfa: None
proxy_api: None
login_url: None
site_password: qwq123
proxy: None
gpt4: None
gpt35: None
history_count: 4
best_history: False
true_del: False
local: True
timeout: 60
oai_only: False
token_file: None
tokens_file: None
config_dir: C:\Users\PlayMcBKuwu\AppData\Local\Pandora-ChatGPT\Pandora-ChatGPT
server: 0.0.0.0:8008
threads: 8
api: False
login_local: False
verbose: True
old_login: False
old_chat: True
file_size: None
type_whitelist: None
type_blacklist: None
file_access: False
device_id: None
debug: False

2024-04-04 16:56:15.670 | INFO     | logging:callHandlers:1706 - Serving on http://0.0.0.0:8008
2024-04-04 16:56:17.925 | ERROR    | logging:callHandlers:1706 - Exception on /api/conversation/talk [POST]
Traceback (most recent call last):

  File "D:\anaconda3\Lib\threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x000001C5B5FADDA0>
    └ <Thread(waitress-0, started daemon 19760)>
  File "D:\anaconda3\Lib\threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x000001C5B5FADA80>
    └ <Thread(waitress-0, started daemon 19760)>
  File "D:\anaconda3\Lib\threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(waitress-0, started daemon 19760)>
    │    │        │    └ (0,)
    │    │        └ <Thread(waitress-0, started daemon 19760)>
    │    └ <bound method ThreadedTaskDispatcher.handler_thread of <waitress.task.ThreadedTaskDispatcher object at 0x000001C5BA01E8D0>>
    └ <Thread(waitress-0, started daemon 19760)>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 84, in handler_thread
    task.service()
    │    └ <function HTTPChannel.service at 0x000001C5B934B380>
    └ <waitress.channel.HTTPChannel connected 127.0.0.1:51551 at 0x1c5ba01fc90>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\channel.py", line 428, in service
    task.service()
    │    └ <function Task.service at 0x000001C5B934A3E0>
    └ <waitress.task.WSGITask object at 0x000001C5BA01F910>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 168, in service
    self.execute()
    │    └ <function WSGITask.execute at 0x000001C5B934A980>
    └ <waitress.task.WSGITask object at 0x000001C5BA01F910>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 434, in execute
    app_iter = self.channel.server.application(environ, start_response)
               │    │       │      │           │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BA020900>
               │    │       │      │           └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
               │    │       │      └ <Flask 'pandora.bots.server'>
               │    │       └ <waitress.server.TcpWSGIServer listening 0.0.0.0:8008 at 0x1c5ba01f890>
               │    └ <waitress.channel.HTTPChannel connected 127.0.0.1:51551 at 0x1c5ba01fc90>
               └ <waitress.task.WSGITask object at 0x000001C5BA01F910>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 2552, in __call__
    return self.wsgi_app(environ, start_response)
           │    │        │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BA020900>
           │    │        └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
           │    └ <werkzeug.middleware.proxy_fix.ProxyFix object at 0x000001C5B9F29910>
           └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\werkzeug\middleware\proxy_fix.py", line 187, in __call__
    return self.app(environ, start_response)
           │    │   │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BA020900>
           │    │   └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
           │    └ <bound method Flask.wsgi_app of <Flask 'pandora.bots.server'>>
           └ <werkzeug.middleware.proxy_fix.ProxyFix object at 0x000001C5B9F29910>
> File "D:\anaconda3\Lib\site-packages\flask\app.py", line 2529, in wsgi_app
    response = self.full_dispatch_request()
               │    └ <function Flask.full_dispatch_request at 0x000001C5B92C82C0>
               └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1825, in full_dispatch_request
    rv = self.handle_user_exception(e)
         │    └ <function CORS.init_app.<locals>._after_request_decorator.<locals>.wrapped_function at 0x000001C5B9FCEAC0>
         └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask_cors\extension.py", line 165, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
           │                  │   │             │  │       └ {}
           │                  │   │             │  └ (KeyError('messages'),)
           │                  │   │             └ <bound method Flask.handle_user_exception of <Flask 'pandora.bots.server'>>
           │                  │   └ <function Flask.make_response at 0x000001C5B92C87C0>
           │                  └ <Flask 'pandora.bots.server'>
           └ <function make_after_request_function.<locals>.cors_after_request at 0x000001C5B9F34220>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1823, in full_dispatch_request
    rv = self.dispatch_request()
         │    └ <function Flask.dispatch_request at 0x000001C5B92C8220>
         └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           │    │           │    │              │    │            └ {}
           │    │           │    │              │    └ 'talk'
           │    │           │    │              └ <Rule '/api/conversation/talk' (POST, OPTIONS) -> talk>
           │    │           │    └ {'static': <function Flask.__init__.<locals>.<lambda> at 0x000001C5B9F03BA0>, 'fake_check': <function ChatBot.fake_check at 0...
           │    │           └ <Flask 'pandora.bots.server'>
           │    └ <function Flask.ensure_sync at 0x000001C5B92C8540>
           └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\pandora_web-1.3.2-py3.11.egg\pandora\bots\server.py", line 799, in talk
    *self.chatgpt.talk(payload, stream,
     │    │       │    │        └ True
     │    │       │    └ {'prompt': 'Hi', 'message_id': '7a01e8f0-6008-4ee7-bad9-41e703a67d19', 'parent_message_id': '2076bea7-ce5e-443b-a325-f76f8f43...
     │    │       └ <function ChatGPT.talk at 0x000001C5B9E2CB80>
     │    └ <pandora.openai.api.ChatGPT object at 0x000001C5B9F27A90>
     └ <pandora.bots.server.ChatBot object at 0x000001C5B9F28A10>
  File "D:\anaconda3\Lib\site-packages\pandora_web-1.3.2-py3.11.egg\pandora\openai\api.py", line 933, in talk
    'metadata': payload['messages'][0].get('metadata', {}),
                └ {'prompt': 'Hi', 'message_id': '7a01e8f0-6008-4ee7-bad9-41e703a67d19', 'parent_message_id': '2076bea7-ce5e-443b-a325-f76f8f43...

KeyError: 'messages'
2024-04-04 16:56:17.977 | ERROR    | logging:callHandlers:1706 - 127.0.0.1  |  /api/conversation/talk  |  500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
2024-04-04 16:56:31.810 | ERROR    | logging:callHandlers:1706 - Exception on /api/conversation/regenerate [POST]
Traceback (most recent call last):

  File "D:\anaconda3\Lib\threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x000001C5B5FADDA0>
    └ <Thread(waitress-1, started daemon 18616)>
  File "D:\anaconda3\Lib\threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x000001C5B5FADA80>
    └ <Thread(waitress-1, started daemon 18616)>
  File "D:\anaconda3\Lib\threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(waitress-1, started daemon 18616)>
    │    │        │    └ (1,)
    │    │        └ <Thread(waitress-1, started daemon 18616)>
    │    └ <bound method ThreadedTaskDispatcher.handler_thread of <waitress.task.ThreadedTaskDispatcher object at 0x000001C5BA01E8D0>>
    └ <Thread(waitress-1, started daemon 18616)>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 84, in handler_thread
    task.service()
    │    └ <function HTTPChannel.service at 0x000001C5B934B380>
    └ <waitress.channel.HTTPChannel connected 127.0.0.1:51551 at 0x1c5ba01fc90>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\channel.py", line 428, in service
    task.service()
    │    └ <function Task.service at 0x000001C5B934A3E0>
    └ <waitress.task.WSGITask object at 0x000001C5BA01F990>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 168, in service
    self.execute()
    │    └ <function WSGITask.execute at 0x000001C5B934A980>
    └ <waitress.task.WSGITask object at 0x000001C5BA01F990>
  File "D:\anaconda3\Lib\site-packages\waitress-2.1.2-py3.11.egg\waitress\task.py", line 434, in execute
    app_iter = self.channel.server.application(environ, start_response)
               │    │       │      │           │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BB12CB80>
               │    │       │      │           └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
               │    │       │      └ <Flask 'pandora.bots.server'>
               │    │       └ <waitress.server.TcpWSGIServer listening 0.0.0.0:8008 at 0x1c5ba01f890>
               │    └ <waitress.channel.HTTPChannel connected 127.0.0.1:51551 at 0x1c5ba01fc90>
               └ <waitress.task.WSGITask object at 0x000001C5BA01F990>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 2552, in __call__
    return self.wsgi_app(environ, start_response)
           │    │        │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BB12CB80>
           │    │        └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
           │    └ <werkzeug.middleware.proxy_fix.ProxyFix object at 0x000001C5B9F29910>
           └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\werkzeug\middleware\proxy_fix.py", line 187, in __call__
    return self.app(environ, start_response)
           │    │   │        └ <function WSGITask.execute.<locals>.start_response at 0x000001C5BB12CB80>
           │    │   └ {'REMOTE_ADDR': '127.0.0.1', 'REMOTE_HOST': '127.0.0.1', 'REMOTE_PORT': '51551', 'REQUEST_METHOD': 'POST', 'SERVER_PORT': '80...
           │    └ <bound method Flask.wsgi_app of <Flask 'pandora.bots.server'>>
           └ <werkzeug.middleware.proxy_fix.ProxyFix object at 0x000001C5B9F29910>
> File "D:\anaconda3\Lib\site-packages\flask\app.py", line 2529, in wsgi_app
    response = self.full_dispatch_request()
               │    └ <function Flask.full_dispatch_request at 0x000001C5B92C82C0>
               └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1825, in full_dispatch_request
    rv = self.handle_user_exception(e)
         │    └ <function CORS.init_app.<locals>._after_request_decorator.<locals>.wrapped_function at 0x000001C5B9FCEAC0>
         └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask_cors\extension.py", line 165, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
           │                  │   │             │  │       └ {}
           │                  │   │             │  └ (KeyError('messages'),)
           │                  │   │             └ <bound method Flask.handle_user_exception of <Flask 'pandora.bots.server'>>
           │                  │   └ <function Flask.make_response at 0x000001C5B92C87C0>
           │                  └ <Flask 'pandora.bots.server'>
           └ <function make_after_request_function.<locals>.cors_after_request at 0x000001C5B9F34220>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1823, in full_dispatch_request
    rv = self.dispatch_request()
         │    └ <function Flask.dispatch_request at 0x000001C5B92C8220>
         └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           │    │           │    │              │    │            └ {}
           │    │           │    │              │    └ 'regenerate'
           │    │           │    │              └ <Rule '/api/conversation/regenerate' (POST, OPTIONS) -> regenerate>
           │    │           │    └ {'static': <function Flask.__init__.<locals>.<lambda> at 0x000001C5B9F03BA0>, 'fake_check': <function ChatBot.fake_check at 0...
           │    │           └ <Flask 'pandora.bots.server'>
           │    └ <function Flask.ensure_sync at 0x000001C5B92C8540>
           └ <Flask 'pandora.bots.server'>
  File "D:\anaconda3\Lib\site-packages\pandora_web-1.3.2-py3.11.egg\pandora\bots\server.py", line 817, in regenerate
    return self.talk()
           │    └ <function ChatBot.talk at 0x000001C5B9E2F060>
           └ <pandora.bots.server.ChatBot object at 0x000001C5B9F28A10>
  File "D:\anaconda3\Lib\site-packages\pandora_web-1.3.2-py3.11.egg\pandora\bots\server.py", line 799, in talk
    *self.chatgpt.talk(payload, stream,
     │    │       │    │        └ True
     │    │       │    └ {'prompt': 'Hi', 'message_id': '7a01e8f0-6008-4ee7-bad9-41e703a67d19', 'parent_message_id': '2076bea7-ce5e-443b-a325-f76f8f43...
     │    │       └ <function ChatGPT.talk at 0x000001C5B9E2CB80>
     │    └ <pandora.openai.api.ChatGPT object at 0x000001C5B9F27A90>
     └ <pandora.bots.server.ChatBot object at 0x000001C5B9F28A10>
  File "D:\anaconda3\Lib\site-packages\pandora_web-1.3.2-py3.11.egg\pandora\openai\api.py", line 933, in talk
    'metadata': payload['messages'][0].get('metadata', {}),
                └ {'prompt': 'Hi', 'message_id': '7a01e8f0-6008-4ee7-bad9-41e703a67d19', 'parent_message_id': '2076bea7-ce5e-443b-a325-f76f8f43...

KeyError: 'messages'
2024-04-04 16:56:31.826 | ERROR    | logging:callHandlers:1706 - 127.0.0.1  |  /api/conversation/regenerate  |  500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
2024-04-04 16:56:32.324 | ERROR    | logging:callHandlers:1706 - 127.0.0.1  |  /api/conversation/gen_title/  |  405 Method Not Allowed: The method is not allowed for the requested URL.
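
Reading the traceback: the old-chat front end posts a flat payload to /api/conversation/talk, but pandora/openai/api.py line 933 indexes payload['messages'][0], so the lookup fails with KeyError: 'messages'. A minimal sketch of the mismatch; only the keys visible in the log above are taken from it, and the "expected" shape is an assumption:

# Flat payload actually sent in old_chat mode, copied from the traceback:
old_chat_payload = {
    'prompt': 'Hi',
    'message_id': '7a01e8f0-6008-4ee7-bad9-41e703a67d19',
    'parent_message_id': '2076bea7-ce5e-443b-a325-f76f8f43...',  # truncated in the log
}

# Assumed OpenAI-style shape that api.py:933 reads from:
expected_payload = {'messages': [{'content': 'Hi', 'metadata': {}}]}

try:
    old_chat_payload['messages'][0].get('metadata', {})  # the access made at api.py:933
except KeyError as err:
    print('KeyError:', err)  # -> KeyError: 'messages'
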
GavinGoo commented 3 months ago

Sorry about that, the earlier changes missed the compatibility handling for the old UI. I'll fix it shortly.
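
Not the actual patch, just a minimal sketch of the kind of compatibility branch this implies, assuming a hypothetical normalize_payload() helper applied before ChatGPT.talk(); apart from 'metadata' (visible in the traceback), the key names inside the message are assumptions:

import uuid

def normalize_payload(payload):
    # Hypothetical helper, not the real fix: bridge the flat old-chat payload
    # into the 'messages'-style dict that pandora/openai/api.py expects.
    if 'messages' in payload:
        return payload                         # already the new shape, pass through
    normalized = dict(payload)                 # keep any other keys untouched
    normalized['messages'] = [{
        'id': payload.get('message_id') or str(uuid.uuid4()),
        'role': 'user',                        # assumed key
        'content': payload.get('prompt', ''),  # assumed key
        'metadata': {},
    }]
    return normalized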

PlayMcBKuwu commented 3 months ago

btw, when will Vercel deployment be added?

GavinGoo commented 3 months ago

Fixed.

GavinGoo commented 3 months ago

I've been busy with other things; so far I've only dashed off a vercel.json and some related adaptation attempts, and they are still untested. I'll take a look at it in the next few days.
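
For reference, Vercel's Python runtime generally serves a module-level WSGI callable (conventionally named app) exported from a file under api/. The sketch below is only a guess at how the Flask instance from pandora.bots.server (visible in the traceback) might be exposed; the create_app() factory is hypothetical and the real wiring may differ from this and from the project's vercel.json:

# api/index.py (hypothetical layout)
from pandora.bots.server import create_app  # hypothetical factory; the project may
                                             # construct its Flask app differently

app = create_app()                           # Vercel serves this WSGI callable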