reworkd / AgentGPT

🤖 Assemble, configure, and deploy autonomous AI Agents in your browser.
https://agentgpt.reworkd.ai
GNU General Public License v3.0

Request failed with status code 409 #1028

VectorZhao opened this issue 1 year ago (status: Open)

Please check that this issue hasn't been reported before.

Expected Behavior

.env

# Platform Config: (max loops determines how many times the agent may execute)
NEXT_PUBLIC_MAX_LOOPS="100"
REWORKD_PLATFORM_MAX_LOOPS=${NEXT_PUBLIC_MAX_LOOPS}

# Deployment Environment:
NODE_ENV="development"
NEXT_PUBLIC_VERCEL_ENV="${NODE_ENV}"

# NextJS:
NEXT_PUBLIC_BACKEND_URL="http://127.0.0.1:8000"

# Next Auth config:
NEXTAUTH_SECRET=changeme
NEXTAUTH_URL="http://127.0.0.1:3000"

# Auth providers (Use if you want to get out of development mode sign-in):
GOOGLE_CLIENT_ID=""
GOOGLE_CLIENT_SECRET=""
GITHUB_CLIENT_ID="***"
GITHUB_CLIENT_SECRET="***"
DISCORD_CLIENT_SECRET="***"
DISCORD_CLIENT_ID="***"

# Backend:
REWORKD_PLATFORM_ENVIRONMENT="${NODE_ENV}"
REWORKD_PLATFORM_FF_MOCK_MODE_ENABLED="false"
REWORKD_PLATFORM_OPENAI_API_KEY="azure-api-key"
REWORKD_PLATFORM_FRONTEND_URL="http://127.0.0.1:3000"
REWORKD_PLATFORM_RELOAD="true"
REWORKD_PLATFORM_OPENAI_API_BASE="https://azproxy.xxx.xxxx/v1"
REWORKD_PLATFORM_SERP_API_KEY="changeme"
REWORKD_PLATFORM_REPLICATE_API_KEY="changeme"

# Database (Backend):
REWORKD_PLATFORM_DATABASE_USER="reworkd_platform"
REWORKD_PLATFORM_DATABASE_PASSWORD="reworkd_platform"
REWORKD_PLATFORM_DATABASE_HOST="db"
REWORKD_PLATFORM_DATABASE_PORT="3307"
REWORKD_PLATFORM_DATABASE_NAME="reworkd_platform"
REWORKD_PLATFORM_DATABASE_URL="mysql://${REWORKD_PLATFORM_DATABASE_USER}:${REWORKD_PLATFORM_DATABASE_PASSWORD}@${REWORKD_PLATFORM_DATABASE_HOST}:${REWORKD_PLATFORM_DATABASE_PORT}/${REWORKD_PLATFORM_DATABASE_NAME}"

# Database (Frontend):
DATABASE_USER="reworkd_platform"
DATABASE_PASSWORD="reworkd_platform"
DATABASE_HOST="db"
DATABASE_PORT="3307"
DATABASE_NAME="reworkd_platform"
DATABASE_URL="mysql://${DATABASE_USER}:${DATABASE_PASSWORD}@${DATABASE_HOST}:${DATABASE_PORT}/${DATABASE_NAME}"
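
A note on the API base above: as the logs below show, the platform backend sends OpenAI function-calling definitions through a 'functions' argument, so the endpoint behind REWORKD_PLATFORM_OPENAI_API_BASE has to support that argument. The standalone probe below is only a sketch and not part of AgentGPT; it uses the pre-1.0 openai client that appears in the traceback, and the base URL, key, and model are the placeholder values from this .env.

# Standalone probe: does the configured API base accept the 'functions' argument?
import openai

openai.api_key = "azure-api-key"                 # REWORKD_PLATFORM_OPENAI_API_KEY (placeholder)
openai.api_base = "https://azproxy.xxx.xxxx/v1"  # REWORKD_PLATFORM_OPENAI_API_BASE (placeholder)

try:
    openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "ping"}],
        # Minimal function definition, used only to test whether the endpoint
        # recognizes the argument at all.
        functions=[{
            "name": "noop",
            "description": "No-op probe function.",
            "parameters": {"type": "object", "properties": {}},
        }],
    )
    print("endpoint accepts 'functions'")
except openai.error.InvalidRequestError as exc:
    # An endpoint without function-calling support answers exactly like the logs below:
    # "Unrecognized request argument supplied: functions"
    print("endpoint rejects 'functions':", exc)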

docker-compose.yml

version: '3.9'

services:
  next:
    container_name: next
    build:
      context: ./next
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    volumes:
      - ./next/.env:/next/.env
      - ./next/:/next/
      - /next/node_modules
      - /next/.next

  platform:
    container_name: platform
    build:
      context: ./platform
      target: prod
    ports:
      - "8000:8000"
    restart: always
    volumes:
      - ./platform:/app/src/
    env_file:
      - next/.env
    environment:
      REWORKD_PLATFORM_HOST: 0.0.0.0
      REWORKD_PLATFORM_DB_HOST: db
      REWORKD_PLATFORM_DB_PORT: "3307"
      REWORKD_PLATFORM_DB_USER: "reworkd_platform"
      REWORKD_PLATFORM_DB_PASS: "reworkd_platform"
      REWORKD_PLATFORM_DB_BASE: "reworkd_platform"
    depends_on:
      - db

  db:
    image: mysql:8.0
    container_name: db
    restart: always
    build:
      context: ./db
    ports:
      - "3307:3307"
    environment:
      MYSQL_DATABASE: "reworkd_platform"
      MYSQL_USER: "reworkd_platform"
      MYSQL_PASSWORD: "reworkd_platform"
      MYSQL_ROOT_PASSWORD: "reworkd_platform"
      MYSQL_TCP_PORT: 3307
    volumes:
      - db_data:/var/lib/mysql
    command: [ 'mysqld', '--character-set-server=utf8mb4', '--collation-server=utf8mb4_unicode_ci' ]

  weaviate:
    image: semitechnologies/weaviate:1.19.6
    restart: on-failure:0
    ports:
      - "8080:8080"
    environment:
      QUERY_DEFAULTS_LIMIT: 100
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'none'
      CLUSTER_HOSTNAME: 'node1'
    volumes:
      - weaviate:/var/lib/weaviate

volumes:
  weaviate:
  db_data:
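
As a basic sanity check of this compose setup (separate from the 409 itself), the platform container needs to reach the db service on port 3307, weaviate on 8080, and the external API base. The short stdlib-only script below is a hypothetical helper, not something shipped with the repo; it could be run inside the platform container, for example with "docker compose exec platform python check.py".

# Hypothetical connectivity check (not part of AgentGPT).
import socket

TARGETS = [
    ("db", 3307),               # MySQL, per MYSQL_TCP_PORT / REWORKD_PLATFORM_DB_PORT
    ("weaviate", 8080),         # Weaviate REST port
    ("azproxy.xxx.xxxx", 443),  # placeholder host from REWORKD_PLATFORM_OPENAI_API_BASE, assuming HTTPS
]

for host, port in TARGETS:
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"OK   {host}:{port}")
    except OSError as exc:
        print(f"FAIL {host}:{port} -> {exc}")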

Current behaviour

logs

next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`status`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?,?)
next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`status`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?,?)
next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`status`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?,?)
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
platform    | 2023-07-12 02:25:12.429 | INFO     | logging:callHandlers:1706 - message='OpenAI API response' path=https://azproxy.xxx.xxxx/v1/chat/completions processing_ms=21.8629 request_id=5ee9c373-74a3-4c88-8fc5-67fed7c6d574 response_code=400
platform    | 2023-07-12 02:25:12.429 | INFO     | logging:callHandlers:1706 - error_code=None error_message='Unrecognized request argument supplied: functions' error_param=None error_type=invalid_request_error message='OpenAI API error received' stream_error=False
platform    | 2023-07-12 02:25:12.449 | ERROR    | reworkd_platform.web.api.error_handling:platformatic_exception_handler:13 - Unrecognized request argument supplied: functions
platform    | Traceback (most recent call last):
platform    | 
platform    |   File "/app/src/reworkd_platform/web/api/agent/helpers.py", line 32, in openai_error_handler
platform    |     return await func(*args, **kwargs)
platform    |                  │     │       └ {'messages': [HumanMessage(content='\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "确定旅行时间和...
platform    |                  │     └ ()
platform    |                  └ <bound method BaseChatModel.apredict_messages of WrappedChatOpenAI(verbose=False, callbacks=None, callback_manager=None, clie...
platform    | 
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 270, in apredict_messages
platform    |     return await self._call_async(messages, stop=_stop, **kwargs)
platform    |                  │    │           │              │        └ {'functions': [{'name': 'image', 'description': 'Used to sketch, draw, or generate an image.', 'parameters': {'type': 'object...
platform    |                  │    │           │              └ None
platform    |                  │    │           └ [HumanMessage(content='\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "确定旅行时间和预算。"\n\n    B...
platform    |                  │    └ <function BaseChatModel._call_async at 0x7f8dbc610fe0>
platform    |                  └ WrappedChatOpenAI(verbose=False, callbacks=None, callback_manager=None, client=<class 'openai.api_resources.chat_completion.C...
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 210, in _call_async
platform    |     result = await self.agenerate(
platform    |                    │    └ <function BaseChatModel.agenerate at 0x7f8dbc610c20>
platform    |                    └ WrappedChatOpenAI(verbose=False, callbacks=None, callback_manager=None, client=<class 'openai.api_resources.chat_completion.C...
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 137, in agenerate
platform    |     raise e
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 127, in agenerate
platform    |     results = await asyncio.gather(
platform    |                     │       └ <function gather at 0x7f8dd4e99760>
platform    |                     └ <module 'asyncio' from '/usr/local/lib/python3.11/asyncio/__init__.py'>
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 385, in _agenerate
platform    |     response = await acompletion_with_retry(
platform    |                      └ <function acompletion_with_retry at 0x7f8dbc612660>
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 92, in acompletion_with_retry
platform    |     return await _completion_with_retry(**kwargs)
platform    |                  │                        └ {'messages': [{'role': 'user', 'content': '\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "...
platform    |                  └ <function acompletion_with_retry.<locals>._completion_with_retry at 0x7f8db55ef2e0>
platform    |   File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
platform    |     return await fn(*args, **kwargs)
platform    |                  │   │       └ {'messages': [{'role': 'user', 'content': '\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "...
platform    |                  │   └ ()
platform    |                  └ <function acompletion_with_retry.<locals>._completion_with_retry at 0x7f8db55ef1a0>
platform    |   File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in __call__
platform    |     do = self.iter(retry_state=retry_state)
platform    |          │    │                └ <RetryCallState 140246623782096: attempt #1; slept for 0.0; last result: failed (InvalidRequestError Unrecognized request arg...
platform    |          │    └ <function BaseRetrying.iter at 0x7f8dbe306480>
platform    |          └ <AsyncRetrying object at 0x7f8db55c5810 (stop=<tenacity.stop.stop_after_attempt object at 0x7f8db55c59d0>, wait=<tenacity.wai...
platform    |   File "/usr/local/lib/python3.11/site-packages/tenacity/__init__.py", line 314, in iter
platform    |     return fut.result()
platform    |            │   └ <function Future.result at 0x7f8dd54d1da0>
platform    |            └ <Future at 0x7f8db641b150 state=finished raised InvalidRequestError>
platform    |   File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
platform    |     return self.__get_result()
platform    |            └ None
platform    |   File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
platform    |     raise self._exception
platform    |           └ None
platform    |   File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
platform    |     result = await fn(*args, **kwargs)
platform    |                    │   │       └ {'messages': [{'role': 'user', 'content': '\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "...
platform    |                    │   └ ()
platform    |                    └ <function acompletion_with_retry.<locals>._completion_with_retry at 0x7f8db55ef060>
platform    |   File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 90, in _completion_with_retry
platform    |     return await llm.client.acreate(**kwargs)
platform    |                  │   │      │         └ {'messages': [{'role': 'user', 'content': '\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "...
platform    |                  │   │      └ <classmethod(<function ChatCompletion.acreate at 0x7f8dd2e7b560>)>
platform    |                  │   └ <class 'openai.api_resources.chat_completion.ChatCompletion'>
platform    |                  └ WrappedChatOpenAI(verbose=False, callbacks=None, callback_manager=None, client=<class 'openai.api_resources.chat_completion.C...
platform    |   File "/usr/local/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 45, in acreate
platform    |     return await super().acreate(*args, **kwargs)
platform    |                                   │       └ {'messages': [{'role': 'user', 'content': '\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task: "...
platform    |                                   └ ()
platform    |   File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 217, in acreate
platform    |     response, _, api_key = await requestor.arequest(
platform    |                                  │         └ <function APIRequestor.arequest at 0x7f8dd2e78040>
platform    |                                  └ <openai.api_requestor.APIRequestor object at 0x7f8db62bb950>
platform    |   File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 382, in arequest
platform    |     resp, got_stream = await self._interpret_async_response(result, stream)
platform    |                              │    │                         │       └ False
platform    |                              │    │                         └ <ClientResponse(https://azproxy.xxx.xxxx/v1/chat/completions) [400 Bad Request]>
platform    |                              │    │                           <CIMultiDictProxy('Date': 'Wed, 12 Jul 2023 ...
platform    |                              │    └ <function APIRequestor._interpret_async_response at 0x7f8dd2e78540>
platform    |                              └ <openai.api_requestor.APIRequestor object at 0x7f8db62bb950>
platform    |   File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 726, in _interpret_async_response
platform    |     self._interpret_response_line(
platform    |     │    └ <function APIRequestor._interpret_response_line at 0x7f8dd2e785e0>
platform    |     └ <openai.api_requestor.APIRequestor object at 0x7f8db62bb950>
platform    |   File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
platform    |     raise self.handle_error_response(
platform    |           │    └ <function APIRequestor.handle_error_response at 0x7f8dd2e780e0>
platform    |           └ <openai.api_requestor.APIRequestor object at 0x7f8db62bb950>
platform    | 
platform    | openai.error.InvalidRequestError: Unrecognized request argument supplied: functions
platform    | 
platform    | 
platform    | During handling of the above exception, another exception occurred:
platform    | 
platform    | 
platform    | Traceback (most recent call last):
platform    | 
platform    |   File "<string>", line 1, in <module>
platform    |   File "/usr/local/lib/python3.11/multiprocessing/spawn.py", line 120, in spawn_main
platform    |     exitcode = _main(fd, parent_sentinel)
platform    |                │     │   └ 4
platform    |                │     └ 7
platform    |                └ <function _main at 0x7f8dd56edbc0>
platform    |   File "/usr/local/lib/python3.11/multiprocessing/spawn.py", line 133, in _main
platform    |     return self._bootstrap(parent_sentinel)
platform    |            │    │          └ 4
platform    |            │    └ <function BaseProcess._bootstrap at 0x7f8dd55f4540>
platform    |            └ <SpawnProcess name='SpawnProcess-1' parent=1 started>
platform    |   File "/usr/local/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
platform    |     self.run()
platform    |     │    └ <function BaseProcess.run at 0x7f8dd57e7a60>
platform    |     └ <SpawnProcess name='SpawnProcess-1' parent=1 started>
platform    |   File "/usr/local/lib/python3.11/multiprocessing/process.py", line 108, in run
platform    |     self._target(*self._args, **self._kwargs)
platform    |     │    │        │    │        │    └ {'config': <uvicorn.config.Config object at 0x7f8dd56ebfd0>, 'target': <bound method Server.run of <uvicorn.server.Server obj...
platform    |     │    │        │    │        └ <SpawnProcess name='SpawnProcess-1' parent=1 started>
platform    |     │    │        │    └ ()
platform    |     │    │        └ <SpawnProcess name='SpawnProcess-1' parent=1 started>
platform    |     │    └ <function subprocess_started at 0x7f8dd4caf240>
platform    |     └ <SpawnProcess name='SpawnProcess-1' parent=1 started>
platform    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/_subprocess.py", line 76, in subprocess_started
platform    |     target(sockets=sockets)
platform    |     │              └ [<socket.socket fd=3, family=2, type=1, proto=0, laddr=('0.0.0.0', 8000)>]
platform    |     └ <bound method Server.run of <uvicorn.server.Server object at 0x7f8dd49bd410>>
platform    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 61, in run
platform    |     return asyncio.run(self.serve(sockets=sockets))
platform    |            │       │   │    │             └ [<socket.socket fd=3, family=2, type=1, proto=0, laddr=('0.0.0.0', 8000)>]
platform    |            │       │   │    └ <function Server.serve at 0x7f8dd4cae2a0>
platform    |            │       │   └ <uvicorn.server.Server object at 0x7f8dd49bd410>
platform    |            │       └ <function run at 0x7f8dd4e40fe0>
platform    |            └ <module 'asyncio' from '/usr/local/lib/python3.11/asyncio/__init__.py'>
platform    |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
platform    |     return runner.run(main)
platform    |            │      │   └ <coroutine object Server.serve at 0x7f8dd49b7780>
platform    |            │      └ <function Runner.run at 0x7f8dd4eb4860>
platform    |            └ <asyncio.runners.Runner object at 0x7f8dd49bcd10>
platform    |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
platform    |     return self._loop.run_until_complete(task)
platform    |            │    │     │                  └ <Task pending name='Task-1' coro=<Server.serve() running at /usr/local/lib/python3.11/site-packages/uvicorn/server.py:81> wai...
platform    |            │    │     └ <method 'run_until_complete' of 'uvloop.loop.Loop' objects>
platform    |            │    └ <uvloop.Loop running=True closed=False debug=False>
platform    |            └ <asyncio.runners.Runner object at 0x7f8dd49bcd10>
platform    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
platform    |     result = await app(  # type: ignore[func-returns-value]
platform    |                    └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7f8db641ad50>
platform    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
platform    |     return await self.app(scope, receive, send)
platform    |                  │    │   │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db652f4...
platform    |                  │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |                  │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |                  │    └ <fastapi.applications.FastAPI object at 0x7f8dbe64be50>
platform    |                  └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7f8db641ad50>
platform    |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 284, in __call__
platform    |     await super().__call__(scope, receive, send)
platform    |                            │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db652f4...
platform    |                            │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |                            └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 122, in __call__
platform    |     await self.middleware_stack(scope, receive, send)
platform    |           │    │                │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db652f4...
platform    |           │    │                │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │                └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7f8db6469f50>
platform    |           └ <fastapi.applications.FastAPI object at 0x7f8dbe64be50>
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
platform    |     await self.app(scope, receive, _send)
platform    |           │    │   │      │        └ <function ServerErrorMiddleware.__call__.<locals>._send at 0x7f8db6269260>
platform    |           │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <starlette.middleware.cors.CORSMiddleware object at 0x7f8db646bb50>
platform    |           └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7f8db6469f50>
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 91, in __call__
platform    |     await self.simple_response(scope, receive, send, request_headers=headers)
platform    |           │    │               │      │        │                     └ Headers({'host': '192.168.125.114:8000', 'connection': 'keep-alive', 'content-length': '293', 'accept': 'application/json, te...
platform    |           │    │               │      │        └ <function ServerErrorMiddleware.__call__.<locals>._send at 0x7f8db6269260>
platform    |           │    │               │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │               └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <function CORSMiddleware.simple_response at 0x7f8dd3b1ae80>
platform    |           └ <starlette.middleware.cors.CORSMiddleware object at 0x7f8db646bb50>
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 146, in simple_response
platform    |     await self.app(scope, receive, send)
platform    |           │    │   │      │        └ functools.partial(<bound method CORSMiddleware.send of <starlette.middleware.cors.CORSMiddleware object at 0x7f8db646bb50>>, ...
platform    |           │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7f8db81f4a50>
platform    |           └ <starlette.middleware.cors.CORSMiddleware object at 0x7f8db646bb50>
platform    | > File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
platform    |     await self.app(scope, receive, sender)
platform    |           │    │   │      │        └ <function ExceptionMiddleware.__call__.<locals>.sender at 0x7f8db55d6480>
platform    |           │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x7f8db6469c10>
platform    |           └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7f8db81f4a50>
platform    |   File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
platform    |     raise e
platform    |   File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
platform    |     await self.app(scope, receive, send)
platform    |           │    │   │      │        └ <function ExceptionMiddleware.__call__.<locals>.sender at 0x7f8db55d6480>
platform    |           │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <fastapi.routing.APIRouter object at 0x7f8dbc99fe10>
platform    |           └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x7f8db6469c10>
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 718, in __call__
platform    |     await route.handle(scope, receive, send)
platform    |           │     │      │      │        └ <function ExceptionMiddleware.__call__.<locals>.sender at 0x7f8db55d6480>
platform    |           │     │      │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │     │      └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │     └ <function Route.handle at 0x7f8dd3ce3920>
platform    |           └ APIRoute(path='/api/agent/analyze', name='analyze_tasks', methods=['POST'])
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
platform    |     await self.app(scope, receive, send)
platform    |           │    │   │      │        └ <function ExceptionMiddleware.__call__.<locals>.sender at 0x7f8db55d6480>
platform    |           │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x7f8db65...
platform    |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.20.0.4', 8000), 'c...
platform    |           │    └ <function request_response.<locals>.app at 0x7f8db662aca0>
platform    |           └ APIRoute(path='/api/agent/analyze', name='analyze_tasks', methods=['POST'])
platform    |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 66, in app
platform    |     response = await func(request)
platform    |                      │    └ <starlette.requests.Request object at 0x7f8db6387810>
platform    |                      └ <function get_request_handler.<locals>.app at 0x7f8db662a3e0>
platform    |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 241, in app
platform    |     raw_response = await run_endpoint_function(
platform    |                          └ <function run_endpoint_function at 0x7f8dd3ce2340>
platform    |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 167, in run_endpoint_function
platform    |     return await dependant.call(**values)
platform    |                  │         │      └ {'req_body': AgentTaskAnalyze(goal='Plan a detailed trip to Hawaii.', model_settings=ModelSettings(model='gpt-3.5-turbo', cus...
platform    |                  │         └ <function analyze_tasks at 0x7f8db65b6700>
platform    |                  └ <fastapi.dependencies.models.Dependant object at 0x7f8db64dcf90>
platform    | 
platform    |   File "/app/src/reworkd_platform/web/api/agent/views.py", line 50, in analyze_tasks
platform    |     return await agent_service.analyze_task_agent(
platform    |                  │             └ <function OpenAIAgentService.analyze_task_agent at 0x7f8db826e5c0>
platform    |                  └ <reworkd_platform.web.api.agent.agent_service.open_ai_agent_service.OpenAIAgentService object at 0x7f8db5576990>
platform    | 
platform    |   File "/app/src/reworkd_platform/web/api/agent/agent_service/open_ai_agent_service.py", line 105, in analyze_task_agent
platform    |     message = await openai_error_handler(
platform    |                     └ <function openai_error_handler at 0x7f8db83a74c0>
platform    | 
platform    |   File "/app/src/reworkd_platform/web/api/agent/helpers.py", line 47, in openai_error_handler
platform    |     raise OpenAIError(e, e.user_message)
platform    |           └ <class 'reworkd_platform.web.api.errors.OpenAIError'>
platform    | 
platform    | reworkd_platform.web.api.errors.OpenAIError: Unrecognized request argument supplied: functions
platform    | 2023-07-12 02:25:12.460 | INFO     | logging:callHandlers:1706 - 192.168.125.106:50973 - "POST /api/agent/analyze HTTP/1.1" 409
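
Reading these logs end to end: the browser reports "Request failed with status code 409" for POST /api/agent/analyze, but the underlying failure is the 400 returned by the configured API base with 'Unrecognized request argument supplied: functions'. That error usually means the Azure deployment or proxy behind REWORKD_PLATFORM_OPENAI_API_BASE does not support OpenAI function calling (on Azure this needs a sufficiently new api-version and model version), so the function-calling request from the backend is rejected and the resulting OpenAIError is surfaced to the frontend as a 409. The generic FastAPI sketch below only illustrates that mapping pattern; it is not reworkd_platform's actual error handler.

# Illustration only: how an upstream LLM error can surface as an HTTP 409.
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()


class UpstreamLLMError(Exception):
    """Stand-in for an error raised when the LLM endpoint rejects a request."""
    def __init__(self, detail: str):
        self.detail = detail


@app.exception_handler(UpstreamLLMError)
async def upstream_llm_error_handler(request: Request, exc: UpstreamLLMError):
    # Surface the upstream failure to the frontend as a 409.
    return JSONResponse(status_code=409, content={"error": exc.detail})


@app.post("/api/agent/analyze")
async def analyze():
    # In the real app this is where the chat-completions call with 'functions'
    # happens; here the upstream rejection is simulated.
    raise UpstreamLLMError("Unrecognized request argument supplied: functions")

In other words, the 409 in the browser is a symptom; the request that actually fails is the chat-completions call made against REWORKD_PLATFORM_OPENAI_API_BASE.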

Steps to reproduce

Run an agent: "Request failed with status code 409" appears, but I can still chat with GPT normally (see the attached screenshot).

Possible solution

No response

Which Operating Systems are you using?

Acknowledgements

jasangill1 commented 1 year ago

Hello @VectorZhao, the logs are a bit messy! How would you replicate this?

VectorZhao commented 1 year ago

What should I do?

VectorZhao commented 1 year ago

When I run any agent, the error "Request failed with status code 409" always appears.

VectorZhao commented 1 year ago
next        | prisma:query BEGIN
next        | prisma:query BEGIN
next        | prisma:query BEGIN
next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`status`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?,?)
next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`status`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?,?)
next        | prisma:query INSERT INTO `reworkd_platform`.`AgentTask` (`id`,`agentId`,`type`,`value`,`sort`,`createDate`) VALUES (?,?,?,?,?,?)
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query SELECT `reworkd_platform`.`AgentTask`.`id`, `reworkd_platform`.`AgentTask`.`agentId`, `reworkd_platform`.`AgentTask`.`type`, `reworkd_platform`.`AgentTask`.`status`, `reworkd_platform`.`AgentTask`.`value`, `reworkd_platform`.`AgentTask`.`info`, `reworkd_platform`.`AgentTask`.`sort`, `reworkd_platform`.`AgentTask`.`deleteDate`, `reworkd_platform`.`AgentTask`.`createDate` FROM `reworkd_platform`.`AgentTask` WHERE `reworkd_platform`.`AgentTask`.`id` = ? LIMIT ? OFFSET ?
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
next        | prisma:query COMMIT
platform    | 2023-07-18 02:24:47.253 | INFO     | logging:callHandlers:1706 - message='OpenAI API response' path=https://azure_base_url/v1/chat/completions processing_ms=12.7857 request_id=546fe5b2-937b-42c1-abd7-1e82bbe9288c response_code=400
platform    | 2023-07-18 02:24:47.253 | INFO     | logging:callHandlers:1706 - error_code=None error_message='Unrecognized request argument supplied: functions' error_param=None error_type=invalid_request_error message='OpenAI API error received' stream_error=False
platform    | 2023-07-18 02:24:47.273 | ERROR    | reworkd_platform.web.api.error_handling:platformatic_exception_handler:13 - Unrecognized request argument supplied: functions
platform    | Traceback (most recent call last):
platform    | 
platform    |   File "/app/src/reworkd_platform/web/api/agent/helpers.py", line 32, in openai_error_handler
platform    |     return await func(*args, **kwargs)
platform    |                  │     │       └ {'messages': [HumanMessage(content='\n    High level objective: "Plan a detailed trip to Hawaii."\n    Current task:
jasangill1 commented 1 year ago

Hello @VectorZhao, where are you located? Would you happen to have network restrictions?

VectorZhao commented 1 year ago

I am in China, but I have a VPN deployed. As you can see in my screenshot, I am able to chat with GPT normally.

VectorZhao commented 1 year ago

I redeployed on Oracle Cloud in Singapore and encountered the same error again.

jasangill1 commented 1 year ago

Hello @VectorZhao, the VPN is causing issues, as there are too many "handshakes" happening between networks.

VectorZhao commented 1 year ago

I redeployed on Oracle Cloud in Singapore and encountered the same error again.

jasangill1 commented 1 year ago

@VectorZhao is there any way you could try this locally?