OpenBMB / XAgent

An Autonomous LLM Agent for Complex Task Solving
https://blog.x-agent.net/blog/xagent/
Apache License 2.0
8.06k stars · 827 forks

websocket connect to GUI/CLI failed #82

Closed · georgehu0815 closed this issue 11 months ago

georgehu0815 commented 11 months ago

I am using a Windows 11 + WSL Ubuntu 22.04 Linux environment.

Steps to reproduce in WSL (collected as shell commands below):

  1. Configure assets/config.yml with the API key.
  2. Go to the root folder and run docker-compose up.
  3. Run docker exec XAgent-Server systemctl start nginx.
  4. Go to the web app at http://localhost:5173/playground and input "hi"; the UI fails to connect to the websocket in XAgent-Server.
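A sketch of the same steps as commands (it assumes the docker-compose.yml in the repository root and the container name XAgent-Server from step 3):

# 1) Put your API key into the config file
vim assets/config.yml
# 2) Build and start all services from the repository root
docker-compose up
# 3) In a second terminal, start nginx inside the server container
docker exec XAgent-Server systemctl start nginx
# 4) Open http://localhost:5173/playground in a browser and submit "hi"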

Running the task from the CLI also fails:

python run.py --task "hi" --model "gpt-4" --config_file "assets/config.yml"

After docker-compose up, the services report a successful startup:

xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [7] [INFO] Started server process [7]
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [7] [INFO] Waiting for application startup.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [7] [INFO] Application startup complete.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [8] [INFO] Started server process [8]
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [8] [INFO] Waiting for application startup.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [8] [INFO] Application startup complete.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [9] [INFO] Started server process [9]
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [9] [INFO] Waiting for application startup.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [9] [INFO] Application startup complete.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [10] [INFO] Started server process [10]
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [10] [INFO] Waiting for application startup.
xagent-ToolServerManager-1 | [2023-10-22 13:48:55 +0000] [10] [INFO] Application startup complete.
XAgent-Server | INFO: Started server process [8]
XAgent-Server | INFO: Waiting for application startup.
XAgent-Server | INFO: Application startup complete.
XAgent-Server | INFO: ('172.21.0.1', 60830) - "WebSocket /ws/45acd1fc-d29c-4fa5-8d16-5fcae23cc897?user_id=admin&token=xagent-admin&description=hi" [accepted]
XAgent-Server | INFO: connection open
XAgent-Server | XAgent Service Startup Param:

Going to http://localhost:5173/playground and saying "hi", the UI is not able to connect to the XAgent-Server websocket:

XAgent-Server | INFO: Started server process [8]
XAgent-Server | INFO: Waiting for application startup.
XAgent-Server | INFO: Application startup complete.
XAgent-Server | INFO: ('172.21.0.1', 60830) - "WebSocket /ws/45acd1fc-d29c-4fa5-8d16-5fcae23cc897?user_id=admin&token=xagent-admin&description=hi" [accepted]
XAgent-Server | INFO: connection open
XAgent-Server | XAgent Service Startup Param:
XAgent-Server | app: app:app
XAgent-Server | prod: False
XAgent-Server | base_dir: XAgentServer
XAgent-Server | use_redis: False
XAgent-Server | recorder_root_dir: running_records
XAgent-Server | default_login: True
XAgent-Server | check_running: False
XAgent-Server | host: 0.0.0.0
XAgent-Server | port: 8090
XAgent-Server | debug: True
XAgent-Server | reload: True
XAgent-Server | workers: 1
XAgent-Server | DB: <class 'XAgentServer.envs.XAgentServerEnv.DB'>
XAgent-Server | Redis: <class 'XAgentServer.envs.XAgentServerEnv.Redis'>
XAgent-Server | Email: <class 'XAgentServer.envs.XAgentServerEnv.Email'>
XAgent-Server | Upload: <class 'XAgentServer.envs.XAgentServerEnv.Upload'>
XAgent-Server | init websocket_queue
XAgent-Server | XAgentServer is running on 0.0.0.0:8090
XAgent-Server | Default user: admin, token: xagent-admin, you can use it to login
XAgent-Server | Create task for pong broadcast
XAgent-Server | init a websocket manager
XAgent-Server | init a thread pool executor, max_workers: 1
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | init localstorage connection: users.json
XAgent-Server | init localstorage connection: interaction.json
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | pong broadcast for active connections: 0
XAgent-Server | Receive connection from 45acd1fc-d29c-4fa5-8d16-5fcae23cc897: user_id: admin, token: xagent-admin, description: hi
XAgent-Server | websocket 45acd1fc-d29c-4fa5-8d16-5fcae23cc897 connected
XAgent-Server | Receive data from 45acd1fc-d29c-4fa5-8d16-5fcae23cc897: {"type":"data","args":{"goal":"hi"},"agent":"agent","mode":"auto","file_list":[]}
XAgent-Server | Register parameter: {'interaction_id': '45acd1fc-d29c-4fa5-8d16-5fcae23cc897', 'parameter_id': '8f77bdada6d84ae5ae8afa63ff918490', 'args': {'goal': 'hi'}} into interaction of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | init interaction: 45acd1fc-d29c-4fa5-8d16-5fcae23cc897
XAgent-Server | Register logger into interaction of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | Register io into interaction of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | Register db into interaction of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | Register logger into XAgentServer of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | Start a new thread to run interaction of 45acd1fc-d29c-4fa5-8d16-5fcae23cc897, done!
XAgent-Server | Constructing an AgentDispatcher: XAgentDispatcher
XAgent-Server | server is running, the start query is hi
XAgent-Server | {
XAgent-Server |   "openai_keys": {
XAgent-Server |     "gpt-3.5-turbo-16k": [
XAgent-Server |       {
XAgent-Server |         "api_key": "",
XAgent-Server |         "organization": "azure",
XAgent-Server |         "model": "gpk",
XAgent-Server |         "engine": "gpt35turbo",
XAgent-Server |         "model_name": "gpt-35-turbo-16k",
XAgent-Server |         "api_base": "/",
XAgent-Server |         "api_type": "azure",
XAgent-Server |         "api_version": "2023-07-01-preview"
XAgent-Server |       }
XAgent-Server |     ],
XAgent-Server |     "gpt-4": [
XAgent-Server |       {
XAgent-Server |         "api_key": "",
XAgent-Server |         "organization": "azure",
XAgent-Server |         "api_base": "ht/",
XAgent-Server |         "api_type": "azure",
XAgent-Server |         "api_version": "2023-preview",
XAgent-Server |         "model": "gpt-4",
XAgent-Server |         "engine": "gpt-4"
XAgent-Server |       }
XAgent-Server |     ]
XAgent-Server |   },
XAgent-Server |   "default_completion_kwargs": {
XAgent-Server |     "model": "gpt-4",
XAgent-Server |     "temperature": 0.2,
XAgent-Server |     "request_timeout": 60
XAgent-Server |   },
XAgent-Server |   "enable_summary": true,
XAgent-Server |   "summary": {
XAgent-Server |     "single_action_max_length": 2048,
XAgent-Server |     "max_return_length": 12384
XAgent-Server |   },
XAgent-Server |   "use_selfhost_toolserver": true,
XAgent-Server |   "selfhost_toolserver_url": "http://0.0.0.0:8080",
XAgent-Server |   "max_retry_times": 3,
XAgent-Server |   "max_subtask_chain_length": 15,
XAgent-Server |   "max_plan_refine_chain_length": 3,
XAgent-Server |   "max_plan_tree_depth": 3,
XAgent-Server |   "max_plan_tree_width": 5,
XAgent-Server |   "max_plan_length": 8192,
XAgent-Server |   "rapidapi_retrieve_tool_count": 0,
XAgent-Server |   "enable_ask_human_for_help": false,
XAgent-Server |   "tool_blacklist": [
XAgent-Server |     "FileSystemEnv_print_filesys_struture"
XAgent-Server |   ],
XAgent-Server |   "record_dir": null
XAgent-Server | }
XAgent-Server | Human-In-The-Loop False
XAgent-Server | ERROR: Exception in ASGI application
XAgent-Server | Traceback (most recent call last):
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 203, in _new_conn
XAgent-Server |     sock = connection.create_connection(
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
XAgent-Server |     raise err
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 73, in create_connection
XAgent-Server |     sock.connect(sa)
XAgent-Server | ConnectionRefusedError: [Errno 111] Connection refused
XAgent-Server |
XAgent-Server | The above exception was the direct cause of the following exception:
XAgent-Server |
XAgent-Server | Traceback (most recent call last):
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 791, in urlopen
XAgent-Server |     response = self._make_request(
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 497, in _make_request
XAgent-Server |     conn.request(
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 395, in request
XAgent-Server |     self.endheaders()
XAgent-Server |   File "/usr/local/lib/python3.10/http/client.py", line 1278, in endheaders
XAgent-Server |     self._send_output(message_body, encode_chunked=encode_chunked)
XAgent-Server |   File "/usr/local/lib/python3.10/http/client.py", line 1038, in _send_output
XAgent-Server |     self.send(msg)
XAgent-Server |   File "/usr/local/lib/python3.10/http/client.py", line 976, in send
XAgent-Server |     self.connect()
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 243, in connect
XAgent-Server |     self.sock = self._new_conn()
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 218, in _new_conn
XAgent-Server |     raise NewConnectionError(
XAgent-Server | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7ff5501b4910>: Failed to establish a new connection: [Errno 111] Connection refused
XAgent-Server |
XAgent-Server | The above exception was the direct cause of the following exception:
XAgent-Server |
XAgent-Server | Traceback (most recent call last):
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
XAgent-Server |     resp = conn.urlopen(
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 845, in urlopen
XAgent-Server |     retries = retries.increment(
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/urllib3/util/retry.py", line 515, in increment
XAgent-Server |     raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
XAgent-Server | urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='0.0.0.0', port=8080): Max retries exceeded with url: /get_cookie (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7ff5501b4910>: Failed to establish a new connection: [Errno 111] Connection refused'))
XAgent-Server |
XAgent-Server | During handling of the above exception, another exception occurred:
XAgent-Server |
XAgent-Server | Traceback (most recent call last):
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 247, in run_asgi
XAgent-Server |     result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
XAgent-Server |     return await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 1115, in __call__
XAgent-Server |     await super().__call__(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
XAgent-Server |     await self.middleware_stack(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 149, in __call__
XAgent-Server |     await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 75, in __call__
XAgent-Server |     await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/base.py", line 26, in __call__
XAgent-Server |     await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
XAgent-Server |     raise exc
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
XAgent-Server |     await self.app(scope, receive, sender)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
XAgent-Server |     raise e
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
XAgent-Server |     await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
XAgent-Server |     await route.handle(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 341, in handle
XAgent-Server |     await self.app(scope, receive, send)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/endpoints.py", line 88, in dispatch
XAgent-Server |     raise exc
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/starlette/endpoints.py", line 80, in dispatch
XAgent-Server |     await self.on_receive(websocket, data)
XAgent-Server |   File "/app/app.py", line 628, in on_receive
XAgent-Server |     await asyncio.create_task(self.do_running_long_task(parameter))
XAgent-Server |   File "/app/app.py", line 688, in do_running_long_task
XAgent-Server |     await task
XAgent-Server |   File "/app/XAgentServer/server.py", line 64, in interact
XAgent-Server |     toolserver_interface.lazy_init(config=config)
XAgent-Server |   File "/app/XAgent/tool_call_handle.py", line 62, in lazy_init
XAgent-Server |     response = requests.post(f'{self.url}/get_cookie',)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/api.py", line 115, in post
XAgent-Server |     return request("post", url, data=data, json=json, **kwargs)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/api.py", line 59, in request
XAgent-Server |     return session.request(method=method, url=url, **kwargs)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
XAgent-Server |     resp = self.send(prep, **send_kwargs)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
XAgent-Server |     r = adapter.send(request, **kwargs)
XAgent-Server |   File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
XAgent-Server |     raise ConnectionError(e, request=request)
XAgent-Server | requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8080): Max retries exceeded with url: /get_cookie (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7ff5501b4910>: Failed to establish a new connection: [Errno 111] Connection refused'))
XAgent-Server | INFO: connection closed

(screenshot attached)

shalimujiang commented 11 months ago

Me too. (screenshot attached)

georgehu0815 commented 11 months ago

Human-In-The-Loop False
ToolServer connected in http://localhost:8080
Start outer loop async
-=-=-=-=-=-=-= BEGIN QUERY SOVLING -=-=-=-=-=-=-=
Role Assistant
Task hi
-=-=-=-=-=-=-= GENERATE INITIAL_PLAN -=-=-=-=-=-=-=
Constructing an Agent: PlanGenerateAgent
chatcompletion: using gpt-4
Traceback (most recent call last):
  File "/mnt/a/XAgent/XAgent/ai_functions/request/openai.py", line 16, in chatcompletion_request
    response = openai.ChatCompletion.create(**chatcompletion_kwargs)
  File "/home/bochuxt/.local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/home/bochuxt/.local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/home/bochuxt/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/home/bochuxt/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/home/bochuxt/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/a/XAgent/XAgent/ai_functions/request/obj_generator.py", line 44, in chatcompletion
    response = self._get_chatcompletion_request_func(request_type)(**kwargs)
  File "/mnt/a/XAgent/XAgent/ai_functions/request/openai.py", line 21, in chatcompletion_request
    logger.info(e)
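For what it's worth, this InvalidRequestError usually means the deployment name given as "engine" in the Azure section of assets/config.yml does not match any deployment on the Azure OpenAI resource. A quick way to check is to call the deployment directly (a sketch; YOUR_RESOURCE, YOUR_DEPLOYMENT and YOUR_KEY are placeholders, and the api-version is the one shown for the gpt-3.5 entry in the config dump above):

# Hypothetical check that the Azure deployment named in "engine" actually exists;
# replace YOUR_RESOURCE, YOUR_DEPLOYMENT and YOUR_KEY with your own values.
curl -sS "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2023-07-01-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: YOUR_KEY" \
  -d '{"messages":[{"role":"user","content":"hi"}]}'
# A DeploymentNotFound error here typically means the "engine" value in assets/config.yml is wrong.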

shalimujiang commented 11 months ago

(quoting the traceback from the previous comment, ending in openai.error.InvalidRequestError: The API deployment for this resource does not exist.)

Hello, could you please tell me how you fixed the "502 Server Error: Bad Gateway for url: http://0.0.0.0:8080/get_available_tools" error? Thank you.

luyaxi commented 11 months ago

It seems that you have misconfigured the ToolServer URL; please make sure the environment variable TOOLSERVER_URL in docker-compose.yml is http://ToolServerManager:8080.
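A quick way to check what the server container actually sees (a sketch; it assumes the container name XAgent-Server from the reproduction steps above):

# Show the rendered compose configuration and look for the ToolServer URL
docker-compose config | grep TOOLSERVER_URL
# Or read the variable inside the running server container; it should print http://ToolServerManager:8080
docker exec XAgent-Server printenv TOOLSERVER_URL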

georgehu0815 commented 11 months ago

(replying to the question above about the "502 Server Error: Bad Gateway for url: http://0.0.0.0:8080/get_available_tools" error:)

I wrote my own start shell script to start the ToolServerManager locally, not from Docker:

gunicorn --workers=1 --worker-class=uvicorn.workers.UvicornWorker -b 0.0.0.0:8080 main:app

luyaxi commented 11 months ago

Please do not start the ToolServerManager locally; we rely on the Docker network to connect our components. You should start all services from Docker.
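In practice that means stopping any locally started gunicorn process and bringing the whole stack up with compose again, so the containers can reach each other over the Docker network (a sketch):

# Stop the stack (and kill any locally running ToolServerManager/gunicorn first)
docker-compose down
# Rebuild and start every service inside the Docker network
docker-compose up --build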

shalimujiang commented 11 months ago

(quoting the reply above about starting the ToolServerManager locally with gunicorn)

thank you

shalimujiang commented 11 months ago

(quoting the same reply about starting the ToolServerManager locally with gunicorn)

Hello, I also have a development environment similar to yours: Windows 10, WSL2, Ubuntu 20.04, and Miniconda. I haven't worked with this kind of development environment before. My previous issues were:

  1. I could access the local localhost:8080 (using Clash as a proxy for external access), but GPT-4 did not respond to requests.
  2. I couldn't reach GPT-4, so I modified the proxy: I added the system defaults to Ubuntu's .bashrc file and enabled "Allow LAN" in Clash, entering the WSL address and port. However, the project kept returning a 502 status code.
  3. Finally, I solved it by modifying the proxy again: I added export no_proxy=localhost,127.0.0.1,::1 after the previous proxy configuration. This way, I can reach both the local ports and external websites.

Here is the proxy configuration I added to ~/.bashrc:

# Open ~/.bashrc and add the proxy settings below
vim ~/.bashrc
export http_proxy=http://proxy_server:proxy_port
export https_proxy=http://proxy_server:proxy_port
# Keep local addresses off the proxy so localhost services stay reachable
export no_proxy=localhost,127.0.0.1,::1
# Reload the shell configuration
source ~/.bashrc
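To double-check that the exclusion works, one can verify the proxy variables and confirm that the ToolServer port answers locally (a sketch; /get_cookie is the endpoint the server POSTs to in the traceback above):

# Show the proxy variables, including the no_proxy exclusion
env | grep -i _proxy
# Any HTTP status code here (instead of a connection error) means the request
# reached the ToolServerManager directly and was not routed through the proxy
curl -sS -o /dev/null -w "%{http_code}\n" -X POST http://localhost:8080/get_cookie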

I hope this helps! Let me know if you have any other questions.