tian-minghui / openai-style-api

Converts API calls to model providers such as OpenAI, Claude, Azure OpenAI, Gemini, Kimi, Zhipu AI, Tongyi Qianwen, and iFLYTEK Spark into OpenAI-style calls. It shields the differences between the various large-model APIs so that any model can be used through the standard OpenAI API format.
MIT License
328 stars · 54 forks

Why can't AutoGPT be used after converting iFLYTEK Spark to the OpenAI format? #10

Open xingwozhonghua126 opened 5 months ago

xingwozhonghua126 commented 5 months ago

Why can't AutoGPT be used after converting iFLYTEK Spark to the OpenAI format? It reports that the port is occupied.

xingwozhonghua126 commented 5 months ago

[screenshot]

xingwozhonghua126 commented 5 months ago

[screenshot] I changed the port, but it still won't connect. [screenshots]

tian-minghui commented 5 months ago

Can you reach it with curl?
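For reference, a minimal way to exercise the proxy directly. The port (8090) and the Bearer key are the ones visible in the server logs pasted later in this thread; substitute your own values. This sketch builds the same request body that curl would send:

```python
import json

# Hypothetical local deployment; port and key taken from the logs in this
# thread -- adjust to your own docker port mapping and configured key.
BASE_URL = "http://127.0.0.1:8090/v1/chat/completions"
API_KEY = "xunfei-spark-api-7c7aa4a3549f12"

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
body = json.dumps(payload)

# Equivalent curl invocation (run from a shell):
#   curl http://127.0.0.1:8090/v1/chat/completions \
#     -H "Content-Type: application/json" \
#     -H "Authorization: Bearer xunfei-spark-api-7c7aa4a3549f12" \
#     -d "$BODY"
print(body)
```

If this request returns 200 from the command line but AutoGPT still fails, the problem is on the client side rather than in the proxy.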

xingwozhonghua126 commented 5 months ago

> Can you reach it with curl?

tian-minghui commented 5 months ago

> [screenshot]

If curl works, the problem is definitely somewhere else. The error says port 8080 is occupied, but you started this project on port 8091, so check whether another program is already using port 8080.
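As a quick way to act on the advice above, a small stdlib-only Python sketch that reports whether a TCP port is already bound; the ports 8080 and 8091 are the ones discussed in this thread:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when some process is listening on that port.
        return s.connect_ex((host, port)) == 0

for port in (8080, 8091):
    print(port, "in use" if port_in_use(port) else "free")
```

On Linux, `lsof -i :8080` or `ss -ltnp` will additionally tell you which process holds the port.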

xingwozhonghua126 commented 5 months ago

It only worked the first time I tested it; after that it stopped working. Maybe it breaks once AutoGPT has used it.

tian-minghui commented 5 months ago

If curl errors out, please paste the server's error log so we can take a look.

xingwozhonghua126 commented 5 months ago

Where is the server's log directory? I started it with docker; does it just print to the command line?
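For a docker or docker-compose deployment, the server logs go to the container's stdout/stderr, so the standard Docker commands show them. The container name `openai-key-1` below is the one visible in the logs pasted in this thread; replace it with yours if it differs:

```shell
# Follow the logs of a single container:
docker logs -f openai-key-1

# Or, if the stack was started with docker compose, follow everything:
docker compose logs -f
```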

xingwozhonghua126 commented 5 months ago

openai-key-1  | INFO:     47.245.85.62:33374 - "POST /v1/chat/completions HTTP/1.1" 200 OK
openai-key-1  | INFO:     111.41.1.47:8278 - "GET /v1/models HTTP/1.1" 404 Not Found
openai-key-1  | 2024-03-10 01:52:27.228 | INFO     | __main__:check_api_key:45 - auth: scheme='Bearer' credentials='xunfei-spark-api-7c7aa4a3549f12'
openai-key-1  | 2024-03-10 01:52:27.230 | INFO     | __main__:create_chat_completion:105 - request: model='gpt-3.5-turbo' messages=[ChatMessage(role='system', content='You are a helpful assistant.', function_call=None), ChatMessage(role='user', content='Hello!', function_call=None)] functions=None temperature=None top_p=None max_length=None stream=False stop=None,  model: <adapters.xunfei_spark.XunfeiSparkAPIModel object at 0x7f923c2c9360>
openai-key-1  | INFO:     47.245.85.62:33400 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
openai-key-1  | ERROR:    Exception in ASGI application
openai-key-1  | Traceback (most recent call last):
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
openai-key-1  |     result = await app(  # type: ignore[func-returns-value]
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
openai-key-1  |     return await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 292, in __call__
openai-key-1  |     await super().__call__(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
openai-key-1  |     await self.middleware_stack(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
openai-key-1  |     raise exc
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
openai-key-1  |     await self.app(scope, receive, _send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
openai-key-1  |     raise exc
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
openai-key-1  |     await self.app(scope, receive, sender)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
openai-key-1  |     raise e
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
openai-key-1  |     await route.handle(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
openai-key-1  |     response = await func(request)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 273, in app
openai-key-1  |     raw_response = await run_endpoint_function(
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 192, in run_endpoint_function
openai-key-1  |     return await run_in_threadpool(dependant.call, **values)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
openai-key-1  |     return await anyio.to_thread.run_sync(func, *args)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
openai-key-1  |     return await get_asynclib().run_sync_in_worker_thread(
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
openai-key-1  |     return await future
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
openai-key-1  |     result = context.run(func, *args)
openai-key-1  |   File "/app/open-api.py", line 110, in create_chat_completion
openai-key-1  |     openai_response = next(resp)
openai-key-1  |   File "/app/adapters/xunfei_spark.py", line 51, in chat_completions
openai-key-1  |     openai_response = self.client_response_to_chatgpt_response(iter_content)
openai-key-1  |   File "/app/adapters/xunfei_spark.py", line 113, in client_response_to_chatgpt_response
openai-key-1  |     for resp_json in iter_resp:
openai-key-1  |   File "/app/clients/xunfei_spark/api/spark_api.py", line 130, in get_resp_from_messages
openai-key-1  |     wss = self.create_wss_connection()
openai-key-1  |   File "/app/clients/xunfei_spark/api/spark_api.py", line 105, in create_wss_connection
openai-key-1  |     return ws_connect(self._wss_url)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/websockets/sync/client.py", line 289, in connect
openai-key-1  |     connection.handshake(
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/websockets/sync/client.py", line 98, in handshake
openai-key-1  |     raise self.protocol.handshake_exc
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/websockets/client.py", line 336, in parse
openai-key-1  |     self.process_response(response)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/websockets/client.py", line 150, in process_response
openai-key-1  |     raise InvalidStatus(response)
openai-key-1  | websockets.exceptions.InvalidStatus: server rejected WebSocket connection: HTTP 403
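The HTTP 403 at the bottom of this trace means the Spark server rejected the WebSocket handshake itself, which points at the signed connection URL rather than at this proxy's request handling. Spark-style endpoints authenticate with an HMAC-SHA256 signature over the host, date, and request line, so an incorrect APISecret or a skewed clock inside the container are typical causes. A generic sketch of this signing scheme follows; the exact string-to-sign for iFLYTEK Spark may differ, so treat the field layout as illustrative and consult the official docs:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def sign_request(api_secret: str, host: str, path: str, date: str) -> str:
    """HMAC-SHA256 signature over host/date/request-line, base64-encoded.

    Generic sketch of signed-URL WebSocket auth; the precise format used
    by iFLYTEK Spark may differ from this illustration.
    """
    string_to_sign = f"host: {host}\ndate: {date}\nGET {path} HTTP/1.1"
    digest = hmac.new(api_secret.encode(), string_to_sign.encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

# The date must closely match the server's clock: a container whose clock
# is skewed produces a signature the server rejects, typically with 403.
date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
sig = sign_request("my-api-secret", "spark-api.example.com", "/chat", date)
print(sig)
```

If curl worked once and then stopped, it is worth checking `date` inside the container against real time, and re-verifying the APIKey/APISecret pair in the config.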

xingwozhonghua126 commented 5 months ago

I get the same error with Bing:

openai-key-1  | INFO:     Started server process [1]
openai-key-1  | INFO:     Waiting for application startup.
openai-key-1  | INFO:     Application startup complete.
openai-key-1  | INFO:     Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
openai-key-1  | 2024-03-10 02:04:06.260 | INFO     | __main__:check_api_key:45 - auth: scheme='Bearer' credentials='bing-7c7aa4a3549f5'
openai-key-1  | 2024-03-10 02:04:06.261 | INFO     | __main__:create_chat_completion:105 - request: model='gpt-3.5-turbo' messages=[ChatMessage(role='system', content='You are a helpful assistant.', function_call=None), ChatMessage(role='user', content='Hello!', function_call=None)] functions=None temperature=None top_p=None max_length=None stream=False stop=None,  model: <adapters.bing_sydney.BingSydneyModel object at 0x7f2940b15cf0>
openai-key-1  | 2024-03-10 02:04:06.262 | WARNING  | adapters.bing_sydney:convertOpenAIParams2Prompt:53 - 暂不支持对话历史,取最近一条对话记录:Hello!
openai-key-1  | 2024-03-10 02:04:06.262 | INFO     | adapters.bing_sydney:__chat_help:58 - prompt:Hello!
openai-key-1  | INFO:     47.245.85.62:33442 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
openai-key-1  | ERROR:    Exception in ASGI application
openai-key-1  | Traceback (most recent call last):
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
openai-key-1  |     result = await app(  # type: ignore[func-returns-value]
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
openai-key-1  |     return await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 292, in __call__
openai-key-1  |     await super().__call__(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
openai-key-1  |     await self.middleware_stack(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
openai-key-1  |     raise exc
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
openai-key-1  |     await self.app(scope, receive, _send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
openai-key-1  |     raise exc
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
openai-key-1  |     await self.app(scope, receive, sender)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
openai-key-1  |     raise e
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
openai-key-1  |     await route.handle(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
openai-key-1  |     await self.app(scope, receive, send)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
openai-key-1  |     response = await func(request)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 273, in app
openai-key-1  |     raw_response = await run_endpoint_function(
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 192, in run_endpoint_function
openai-key-1  |     return await run_in_threadpool(dependant.call, **values)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
openai-key-1  |     return await anyio.to_thread.run_sync(func, *args)
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
openai-key-1  |     return await get_asynclib().run_sync_in_worker_thread(
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
openai-key-1  |     return await future
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
openai-key-1  |     result = context.run(func, *args)
openai-key-1  |   File "/app/open-api.py", line 110, in create_chat_completion
openai-key-1  |     openai_response = next(resp)
openai-key-1  |   File "/app/adapters/bing_sydney.py", line 41, in chat_completions
openai-key-1  |     result = asyncio.run(async_gen)
openai-key-1  |   File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
openai-key-1  |     return loop.run_until_complete(main)
openai-key-1  |   File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
openai-key-1  |   File "/app/adapters/bing_sydney.py", line 59, in __chat_help
openai-key-1  |     async with SydneyClient(self.style, self.cookie, self.proxy) as client:
openai-key-1  |   File "/app/clients/sydney/sydney.py", line 96, in __aenter__
openai-key-1  |     await self.start_conversation()
openai-key-1  |   File "/app/clients/sydney/sydney.py", line 467, in start_conversation
openai-key-1  |     response_dict = await response.json()
openai-key-1  |   File "/usr/local/lib/python3.10/site-packages/aiohttp/client_reqrep.py", line 1104, in json
openai-key-1  |     raise ContentTypeError(
openai-key-1  | aiohttp.client_exceptions.ContentTypeError: 0, message='Attempt to decode JSON with unexpected mimetype: ', url=URL('https://edgeservices.bing.com/edgesvc/turing/conversation/create?bundleVersion=1.1381.12')
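The `ContentTypeError` at the end means `edgeservices.bing.com` answered with a non-JSON body (often an HTML error or challenge page when the cookie or region is rejected), so aiohttp's `response.json()` refuses to parse it. A defensive pattern is to inspect the Content-Type and surface the raw body instead of failing opaquely; a minimal, framework-agnostic sketch (the helper name is hypothetical, not part of this project):

```python
import json

def parse_json_response(content_type: str, body: bytes):
    """Parse a JSON body, raising a readable error for non-JSON replies.

    Mirrors the check behind aiohttp's response.json(), but keeps a
    snippet of the raw body in the error so the actual server reply
    (e.g. an HTML block page) can be inspected in the logs.
    """
    if "json" not in content_type.lower():
        snippet = body[:200].decode("utf-8", errors="replace")
        raise ValueError(
            f"expected JSON, got Content-Type={content_type!r}; "
            f"body starts with: {snippet!r}"
        )
    return json.loads(body)

print(parse_json_response("application/json", b'{"ok": true}'))
```

With aiohttp specifically, `await response.json(content_type=None)` skips the Content-Type check, and `await response.text()` lets you log what the server actually returned before deciding how to handle it.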