Closed: JeremyBickel closed this issue 11 months ago.
Could you paste your config.yaml (without the key)?
Note: I've tried it with Open LLM as well as OpenAI.
```yaml
WORKSPACE_PATH: "/media/jeremy/2TB/outgoing"
OPENAI_API_BASE: "http://0.0.0.0:5000/v1"
OPENAI_PROXY: "http://127.0.0.1:5000/v1"
OPENAI_API_KEY: "sk-YOUR_API_KEY"  # set the value to sk-xxx if you host the openai interface for open llm model
OPENAI_API_MODEL: "default"
MAX_TOKENS: 4096
RPM: 10

SEARCH_ENGINE: ddg
WEB_BROWSER_ENGINE: playwright
PLAYWRIGHT_BROWSER_TYPE: chromium
PROMPT_FORMAT: markdown  # json or markdown
```
@JeremyBickel you can pull the latest code and give it a try.
@better629 'git pull' and 'pip install -e .' yielded this problem, so I deleted my environment and reinstalled.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
litellm 0.1.824 requires openai<0.29.0,>=0.27.0, but you have openai 1.6.0 which is incompatible.
open-interpreter 0.1.7 requires openai<0.29.0,>=0.28.0, but you have openai 1.6.0 which is incompatible.
open-interpreter 0.1.7 requires tiktoken<0.5.0,>=0.4.0, but you have tiktoken 0.5.2 which is incompatible.
Then I copied my key.yaml back, commented out the OpenAI settings, and uncommented the open_llm settings.
Then:
metagpt "Write a cli snake game" 2023-12-21 04:00:33.707 | INFO | metagpt.const:get_metagpt_package_root:32 - Package root set to /media/jeremy/2TB/metagpt 2023-12-21 04:00:34.537 | INFO | metagpt.team:invest:84 - Investment: $3.0. 2023-12-21 04:00:34.538 | INFO | metagpt.roles.role:_act:379 - Alice(Product Manager): ready to PrepareDocuments 2023-12-21 04:00:34.552 | INFO | metagpt.utils.file_repository:save:60 - save to: /media/jeremy/2TB/outgoing/20231221040034/docs/requirement.txt 2023-12-21 04:00:34.554 | INFO | metagpt.roles.role:_act:379 - Alice(Product Manager): ready to WritePRD 2023-12-21 04:00:34.556 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 0.000(s), this was the 1st time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:34.759 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 0.203(s), this was the 2nd time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:35.927 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 1.372(s), this was the 3rd time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:38.238 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 3.683(s), this was the 4th time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:44.135 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 9.580(s), this was the 5th time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:50.845 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 16.290(s), this was the 6th time calling it. exp: 'OpenLLMGPTAPI' object has no attribute 'async_client' 2023-12-21 04:00:50.846 | WARNING | metagpt.utils.common:wrapper:505 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory. 2023-12-21 04:00:50.857 | ERROR | metagpt.utils.common:wrapper:487 - Exception occurs, start to serialize the project, exp: Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 256, in _aask_v1 content = await self.llm.aask(prompt, system_msgs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AttributeError: 'OpenLLMGPTAPI' object has no attribute 'async_client'
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 496, in wrapper return await func(self, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 528, in run rsp = await self.react() ^^^^^^^^^^^^^^^^^^ tenacity.RetryError: RetryError[<Future at 0x7ff131b61850 state=finished raised AttributeError>]
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 482, in wrapper result = await func(self, *args, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/team.py", line 124, in run await self.env.run() Exception: Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(*args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 256, in _aask_v1 content = await self.llm.aask(prompt, system_msgs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/base_gpt_api.py", line 53, in aask rsp = await self.acompletion_text(message, stream=stream) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped return await fn(args, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/init.py", line 314, in iter return fut.result() ^^^^^^^^^^^^ File "/usr/lib/python3.11/concurrent/futures/_base.py", line 449, in result return self.get_result() ^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.11/concurrent/futures/_base.py", line 401, in get_result raise self._exception File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/openai_api.py", line 274, in acompletion_text return await self._achat_completion_stream(messages) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/openai_api.py", line 211, in _achat_completion_stream response: AsyncStream[ChatCompletionChunk] = await self.async_client.chat.completions.create( ^^^^^^^^^^^^^^^^^ AttributeError: 'OpenLLMGPTAPI' object has no attribute 'async_client'
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 496, in wrapper return await func(self, *args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 528, in run rsp = await self.react() ^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 479, in react rsp = await self._react() ^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 459, in _react rsp = await self._act() # 这个rsp是否需要publish_message? ^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 380, in _act response = await self._rc.todo.run(self._rc.important_memory) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 105, in run prd_doc = await self._update_prd( ^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 146, in _update_prd prd = await self._run_new_requirement( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 126, in _run_new_requirement node = await WRITE_PRD_NODE.fill(context=context, llm=self.llm, schema=schema) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 314, in fill return await self.simple_fill(schema, mode) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 288, in simple_fill content, scontent = await self._aask_v1(prompt, class_name, mapping, schema=schema) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped return await fn(args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/init.py", line 326, in iter raise retry_exc from fut.exception() tenacity.RetryError: RetryError[<Future at 0x7ff131b61850 state=finished raised AttributeError>]
@better629 I switched back from open_llm to openai (still pointing at my ooba server), and got this:
metagpt "Write a cli snake game" 2023-12-21 04:12:56.577 | INFO | metagpt.const:get_metagpt_package_root:32 - Package root set to /media/jeremy/2TB/metagpt 2023-12-21 04:12:57.036 | INFO | metagpt.config:get_default_llm_provider_enum:88 - OpenAI API Model: default 2023-12-21 04:12:57.792 | INFO | metagpt.team:invest:84 - Investment: $3.0. 2023-12-21 04:12:57.794 | INFO | metagpt.roles.role:_act:379 - Alice(Product Manager): ready to PrepareDocuments 2023-12-21 04:12:57.819 | INFO | metagpt.utils.file_repository:save:60 - save to: /media/jeremy/2TB/outgoing/20231221041257/docs/requirement.txt 2023-12-21 04:12:57.821 | INFO | metagpt.roles.role:_act:379 - Alice(Product Manager): ready to WritePRD 2023-12-21 04:13:00.604 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.781(s), this was the 1st time calling it. 2023-12-21 04:13:03.700 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.877(s), this was the 2nd time calling it. 2023-12-21 04:13:06.812 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 8.989(s), this was the 3rd time calling it. 2023-12-21 04:13:12.601 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 14.778(s), this was the 4th time calling it. 2023-12-21 04:13:19.604 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 21.781(s), this was the 5th time calling it. 2023-12-21 04:13:24.594 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 26.771(s), this was the 6th time calling it. 2023-12-21 04:13:24.595 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:13:24.595 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:13:24.595 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 26.772(s), this was the 1st time calling it. exp: Connection error. 2023-12-21 04:13:28.096 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.686(s), this was the 1st time calling it. 2023-12-21 04:13:30.868 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.458(s), this was the 2nd time calling it. 2023-12-21 04:13:34.929 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 9.519(s), this was the 3rd time calling it. 2023-12-21 04:13:41.511 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 16.100(s), this was the 4th time calling it. 2023-12-21 04:13:47.259 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 21.848(s), this was the 5th time calling it. 2023-12-21 04:13:52.240 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 26.829(s), this was the 6th time calling it. 2023-12-21 04:13:52.240 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:13:52.240 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:13:52.241 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 54.418(s), this was the 2nd time calling it. exp: Connection error. 2023-12-21 04:13:56.263 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.832(s), this was the 1st time calling it. 2023-12-21 04:13:59.339 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.908(s), this was the 2nd time calling it. 2023-12-21 04:14:02.860 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 9.429(s), this was the 3rd time calling it. 2023-12-21 04:14:06.312 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 12.882(s), this was the 4th time calling it. 2023-12-21 04:14:11.685 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 18.254(s), this was the 5th time calling it. 2023-12-21 04:14:20.093 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 26.662(s), this was the 6th time calling it. 2023-12-21 04:14:20.093 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:14:20.093 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:14:20.094 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 82.271(s), this was the 3rd time calling it. exp: Connection error. 2023-12-21 04:14:24.568 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.744(s), this was the 1st time calling it. 2023-12-21 04:14:27.700 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.876(s), this was the 2nd time calling it. 2023-12-21 04:14:31.160 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 9.337(s), this was the 3rd time calling it. 2023-12-21 04:14:34.821 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 12.997(s), this was the 4th time calling it. 2023-12-21 04:14:39.941 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 18.117(s), this was the 5th time calling it. 2023-12-21 04:14:44.619 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 22.796(s), this was the 6th time calling it. 2023-12-21 04:14:44.620 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:14:44.620 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:14:44.620 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 106.798(s), this was the 4th time calling it. exp: Connection error. 2023-12-21 04:14:52.212 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.768(s), this was the 1st time calling it. 2023-12-21 04:14:54.923 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.478(s), this was the 2nd time calling it. 2023-12-21 04:14:58.565 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 9.121(s), this was the 3rd time calling it. 2023-12-21 04:15:04.205 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 14.761(s), this was the 4th time calling it. 2023-12-21 04:15:09.723 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 20.278(s), this was the 5th time calling it. 2023-12-21 04:15:25.001 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 35.557(s), this was the 6th time calling it. 2023-12-21 04:15:25.002 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:15:25.002 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:15:25.003 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 147.180(s), this was the 5th time calling it. exp: Connection error. 2023-12-21 04:15:38.359 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 2.755(s), this was the 1st time calling it. 2023-12-21 04:15:41.150 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 5.547(s), this was the 2nd time calling it. 2023-12-21 04:15:44.005 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 8.402(s), this was the 3rd time calling it. 2023-12-21 04:15:46.940 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 11.336(s), this was the 4th time calling it. 2023-12-21 04:15:57.572 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 21.968(s), this was the 5th time calling it. 2023-12-21 04:16:12.520 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.openai_api.OpenAIGPTAPI.acompletion_text' after 36.916(s), this was the 6th time calling it. 2023-12-21 04:16:12.520 | ERROR | metagpt.provider.openai_api:log_and_reraise:142 - Retry attempts exhausted. Last exception: Connection error. 2023-12-21 04:16:12.520 | WARNING | metagpt.provider.openai_api:log_and_reraise:143 - Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ See FAQ 5.8
2023-12-21 04:16:12.520 | ERROR | metagpt.utils.common:log_it:433 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 194.698(s), this was the 6th time calling it. exp: Connection error. 2023-12-21 04:16:12.520 | WARNING | metagpt.utils.common:wrapper:505 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory. 2023-12-21 04:16:12.533 | ERROR | metagpt.utils.common:wrapper:487 - Exception occurs, start to serialize the project, exp: Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpx.ProxyError: 404 Not Found
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpx.ProxyError: 404 Not Found
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ httpx.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 256, in _aask_v1 content = await self.llm.aask(prompt, system_msgs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ openai.APIConnectionError: Connection error.
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 496, in wrapper return await func(self, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 528, in run rsp = await self.react() ^^^^^^^^^^^^^^^^^^ tenacity.RetryError: RetryError[<Future at 0x7f562250a290 state=finished raised APIConnectionError>]
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 482, in wrapper result = await func(self, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/team.py", line 124, in run await self.env.run() Exception: Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 268, in handle_async_request raise exc File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 251, in handle_async_request response = await connection.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/http_proxy.py", line 298, in handle_async_request raise ProxyError(msg) httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1674, in _send_handling_auth response = await self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1711, in _send_handling_redirects response = await self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1748, in _send_single_request response = await transport.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 370, in handle_async_request with map_httpcore_exceptions(): File "/usr/lib/python3.11/contextlib.py", line 155, in exit self.gen.throw(typ, value, traceback) File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 84, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.ProxyError: 404 Not Found
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 268, in handle_async_request raise exc File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 251, in handle_async_request response = await connection.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/http_proxy.py", line 298, in handle_async_request raise ProxyError(msg) httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1674, in _send_handling_auth response = await self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1711, in _send_handling_redirects response = await self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1748, in _send_single_request response = await transport.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 370, in handle_async_request with map_httpcore_exceptions(): File "/usr/lib/python3.11/contextlib.py", line 155, in exit self.gen.throw(typ, value, traceback) File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 84, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.ProxyError: 404 Not Found
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions yield File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 371, in handle_async_request resp = await self._pool.handle_async_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 268, in handle_async_request raise exc File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 251, in handle_async_request response = await connection.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpcore/_async/http_proxy.py", line 298, in handle_async_request raise ProxyError(msg) httpcore.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request response = await self._client.send( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1646, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1674, in _send_handling_auth response = await self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1711, in _send_handling_redirects response = await self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_client.py", line 1748, in _send_single_request response = await transport.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 370, in handle_async_request with map_httpcore_exceptions(): File "/usr/lib/python3.11/contextlib.py", line 155, in exit self.gen.throw(typ, value, traceback) File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/httpx/_transports/default.py", line 84, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.ProxyError: 404 Not Found
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(*args, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 256, in _aask_v1 content = await self.llm.aask(prompt, system_msgs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/base_gpt_api.py", line 53, in aask rsp = await self.acompletion_text(message, stream=stream) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped return await fn(*args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/init.py", line 322, in iter return self.retry_error_callback(retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/openai_api.py", line 149, in log_and_reraise raise retry_state.outcome.exception() File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in call result = await fn(args, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/openai_api.py", line 274, in acompletion_text return await self._achat_completion_stream(messages) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/provider/openai_api.py", line 211, in _achat_completion_stream response: AsyncStream[ChatCompletionChunk] = await self.async_client.chat.completions.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1295, in create return await self._post( ^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1536, in post return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1315, in request return await self._request( ^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1358, in _request return await self._retry_request( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1418, in _retry_request return await self._request( ^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1358, in _request return await self._retry_request( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1418, in _retry_request return await self._request( ^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/openai/_base_client.py", line 1367, in _request raise APIConnectionError(request=request) from err openai.APIConnectionError: Connection error.
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/media/jeremy/2TB/metagpt/metagpt/utils/common.py", line 496, in wrapper return await func(self, *args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 528, in run rsp = await self.react() ^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 479, in react rsp = await self._react() ^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 459, in _react rsp = await self._act() # 这个rsp是否需要publish_message? ^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/roles/role.py", line 380, in _act response = await self._rc.todo.run(self._rc.important_memory) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 105, in run prd_doc = await self._update_prd( ^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 146, in _update_prd prd = await self._run_new_requirement( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/write_prd.py", line 126, in _run_new_requirement node = await WRITE_PRD_NODE.fill(context=context, llm=self.llm, schema=schema) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 314, in fill return await self.simple_fill(schema, mode) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/media/jeremy/2TB/metagpt/metagpt/actions/action_node.py", line 288, in simple_fill content, scontent = await self._aask_v1(prompt, class_name, mapping, schema=schema) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped return await fn(args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jeremy/python_envs/metagpt/lib/python3.11/site-packages/tenacity/init.py", line 326, in iter raise retry_exc from fut.exception() tenacity.RetryError: RetryError[<Future at 0x7f562250a290 state=finished raised APIConnectionError>]
@JeremyBickel we updated to support openai 1.x. Did you replace OPENAI_API_BASE with OPENAI_BASE_URL in config.yaml?
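For reference, the openai 1.x Python client takes the endpoint as base_url (falling back to the OPENAI_BASE_URL environment variable) instead of the old openai.api_base, which is presumably why the config key was renamed. Below is a minimal sketch (not MetaGPT code) against a local OpenAI-compatible server, reusing the address and placeholder key from the config pasted above:

```python
# Sketch only (not MetaGPT code): openai>=1.0 client pointed at a local
# OpenAI-compatible server. base_url replaces the 0.x openai.api_base.
import asyncio
from openai import AsyncOpenAI

async def main() -> None:
    client = AsyncOpenAI(
        base_url="http://0.0.0.0:5000/v1",  # local server address from the config above
        api_key="sk-YOUR_API_KEY",          # placeholder; most local servers do not check it
    )
    rsp = await client.chat.completions.create(
        model="default",
        messages=[{"role": "user", "content": "Write a cli snake game"}],
    )
    print(rsp.choices[0].message.content)

asyncio.run(main())
```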
I switched back from open_llm to openai (still pointing at my ooba server), and got this:
metagpt "Write a cli snake game" 2023-12-21 04:12:56.577 | INFO | metagpt.const:get_metagpt_package_root:32 - Package root set to /media/jeremy/2TB/metagpt 2023-12-21 04:12:57.036 | INFO | metagpt.config:get_default_llm_provider_enum:88 - OpenAI API Model: default # here
It seems that you are still using the openai API?
@better629 To be clear, after you told me to try the new code, I did a pull but got pip dependency errors. So I made a completely fresh clone and deleted and reinstalled my environment. I copied my old key.yaml back (the one pasted in this thread), but at that point I switched from openai to open_llm and tested it. You can see that outcome a few messages ago, in the message that starts:
@better629 'git pull' and 'pip install -e .' yielded this problem, so I deleted my environment and reinstalled.
After that, I switched from open_llm back to openai, still pointing at the local server (I never point at openai.com in any of this), and got the much longer error output that your last message referred to.
I tried various things before filing this bug report, mainly adding print statements to track it down, and then trying to hotwire the Enum into the provider code, because it kept saying that the key 'open_llm' was invalid there (that's the error in the first message of this thread), but I just couldn't pin it down. My best understanding at the time was that open_llm wasn't being registered as a provider for whatever reason; see the sketch below for how I picture that pattern.
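If I understand the pattern, a registry like that roughly works as follows (my own sketch with made-up names, not MetaGPT's real code), which is why a provider whose module is never imported or registered shows up as an "invalid key":

```python
# Rough sketch of an enum-keyed provider registry (hypothetical names).
# A provider class only ends up in the registry if its registration code
# actually runs; otherwise the enum lookup raises KeyError, as in my logs.
from enum import Enum

class LLMProvider(Enum):
    OPENAI = "openai"
    OPEN_LLM = "open_llm"

class Registry:
    def __init__(self):
        self.providers = {}

    def register(self, key: LLMProvider):
        def decorator(cls):
            self.providers[key] = cls
            return cls
        return decorator

    def get_provider(self, key: LLMProvider):
        # Raises KeyError: <LLMProvider.OPEN_LLM: 'open_llm'> if nothing registered it.
        return self.providers[key]()

REGISTRY = Registry()

@REGISTRY.register(LLMProvider.OPEN_LLM)
class OpenLLMProvider:
    pass

print(REGISTRY.get_provider(LLMProvider.OPEN_LLM))
```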
After your latest fix, that error has gone away, but it still fails with "'OpenLLMGPTAPI' object has no attribute 'async_client'", which seems related. This code confuses me - my mind just won't track with it - so take it with a grain of salt when I say that however you made it see and register OpenLLMGPTAPI may not have gone far enough, or didn't quite register a fully initialized object... or something?
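If it helps, here is my guess at how an attribute like async_client can end up missing (purely speculative, hypothetical class names): the base provider builds the client in its __init__, and a subclass that overrides __init__ without calling super() never gets async_client set:

```python
# Speculative illustration with made-up class names, not MetaGPT's code.
# Overriding __init__ without calling super().__init__() means the async
# client attribute is never created, so later use raises this exact error.
from openai import AsyncOpenAI

class BaseOpenAIProvider:
    def __init__(self, base_url: str, api_key: str):
        self.async_client = AsyncOpenAI(base_url=base_url, api_key=api_key)

class LocalLLMProvider(BaseOpenAIProvider):
    def __init__(self, base_url: str, api_key: str):
        # Missing super().__init__(base_url, api_key) -> no async_client.
        self.base_url = base_url
        self.api_key = api_key

provider = LocalLLMProvider("http://0.0.0.0:5000/v1", "sk-xxx")
try:
    provider.async_client
except AttributeError as err:
    print(err)  # 'LocalLLMProvider' object has no attribute 'async_client'
```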
@better629 The only reason I tried OpenAI at all is that it seemed to be the only provider getting registered. So I pointed that provider at my local server, and it didn't work. I see now that talking so much about OpenAI here might be muddying the waters. I am using open_llm and getting these errors.
@better629 I got past the error. It had to do with the location of key.yaml. I had left it in the config directory. When I recognized my mistake and put it in a .metagpt directory in my home dir, it still didn't work. But with a little tracing I found that metagpt/config.py looks for a config file in three places, and the one pointing at my home directory was ~/.metagpt/config.yaml. When I changed it to ~/.metagpt/key.yaml, it got past the error. For anyone else who hits this, the lookup behaves roughly like the sketch below.
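This is only my sketch of the "first existing config file wins" pattern, with filenames from my own setup; the real metagpt/config.py may differ in order and detail:

```python
# Sketch of a config lookup that returns the first candidate file that exists
# (candidate paths and their order are assumptions based on my setup).
from pathlib import Path

CANDIDATES = [
    Path("config/key.yaml"),                 # key file inside the repo checkout
    Path("config/config.yaml"),              # project config inside the repo checkout
    Path.home() / ".metagpt" / "key.yaml",   # was config.yaml before my local change
]

def pick_config() -> Path | None:
    for candidate in CANDIDATES:
        if candidate.exists():
            return candidate
    return None

print(pick_config() or "no config file found")
```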
@JeremyBickel The current main branch supports "GeminiGPTAPI", "OpenLLMGPTAPI", "OpenAIGPTAPI", "ZhiPuAIGPTAPI", and "OllamaGPTAPI". Usually I edit config/key.yaml. Note that, due to the update to the new openai version, OPENAI_API_BASE has been changed to OPENAI_BASE_URL. If there are any problems, please comment.
Thank you, @better629. I've tried several times to get this to work, and I just can't seem to do it. I really don't know why. It looks like the open_llm provider is using openai without providing chat completion functions. I tried to hack those functions in from the openai provider, but it was too extensive for me to handle. It just now occurs to me that "open_llm" might not mean any local server running an open-source LLM. I've been trying to use oobabooga's text-generation-webui with it. Does "open_llm" mean https://github.com/bentoml/OpenLLM? If so, then we've been talking at cross purposes, because I don't even run that :^D
@JeremyBickel open_llm means models that you host yourself; see https://docs.deepwisdom.ai/main/en/guide/tutorials/integration_with_open_llm.html#model-deployment. So which inference repo are you currently using?
@better629 I see. I'm not using any of those, but maybe I can get one going if it's required. The one I'm using now is ooba. It exposes a local API that behaves like OpenAI's. That's why I tried the OpenAI interface with the base pointing at localhost, where the ooba API is served.
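In case it's useful, this is the kind of sanity check that can be run against that local endpoint with the plain openai client, outside MetaGPT (the address and placeholder key are assumptions copied from the config pasted earlier in this thread):

```python
# Quick reachability check for a local OpenAI-compatible API such as the one
# exposed by text-generation-webui. Run this before blaming the framework.
from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:5000/v1", api_key="sk-xxx")

# List whatever model the server has loaded.
print([m.id for m in client.models.list().data])

# One tiny chat completion to prove the /chat/completions route works.
rsp = client.chat.completions.create(
    model="default",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(rsp.choices[0].message.content)
```

If a direct call like this works but MetaGPT still fails with httpx.ProxyError: 404 Not Found, the OPENAI_PROXY entry in the config above (pointing at the same 127.0.0.1:5000 address) may be worth removing, since it routes API requests through that port as an HTTP proxy.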
I don't know if you celebrate Christmas. I don't celebrate it very much, except the Master's birth is the most tremendous event in human history (apart from His later saving work on the cross, and then 3 days later..). Since it's Christmastime, whether He knows you or not, be blessed by knowledge of God's Christ, Savior of the World, the only Super-Man, my Master Jesus. :^D Whew! Makes me smile just to mention Him this little bit.
2023-12-20 13:26:10.211 | INFO | metagpt.const:get_metagpt_package_root:32 - Package root set to /media/jeremy/2TB/MetaGPT 2023-12-20 13:26:10.264 | WARNING | metagpt.config:_update:132 - LONG_TERM_MEMORY is True 2023-12-20 13:26:10.500 | INFO | metagpt.config:get_default_llm_provider_enum:81 - OpenAI API Model: gpt-4-1106-preview ╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮ │ /media/jeremy/2TB/MetaGPT/metagpt/startup.py:50 in startup │ │ │ │ 47 │ company = Team() │ │ 48 │ company.hire( │ │ 49 │ │ [ │ │ ❱ 50 │ │ │ ProductManager(), │ │ 51 │ │ │ Architect(), │ │ 52 │ │ │ ProjectManager(), │ │ 53 │ │ ] │ │ │ │ ╭───────────────────────────────────── locals ─────────────────────────────────────╮ │ │ │ Architect = <class 'metagpt.roles.architect.Architect'> │ │ │ │ code_review = True │ │ │ │ company = Team( │ │ │ │ │ env=Environment( │ │ │ │ │ │ roles={}, │ │ │ │ │ │ members={}, │ │ │ │ │ │ history='' │ │ │ │ │ ), │ │ │ │ │ investment=10.0, │ │ │ │ │ idea='' │ │ │ │ ) │ │ │ │ Engineer = <class 'metagpt.roles.engineer.Engineer'> │ │ │ │ idea = 'Write a cli snake game' │ │ │ │ implement = True │ │ │ │ inc = False │ │ │ │ investment = 3.0 │ │ │ │ max_auto_summarize_code = 0 │ │ │ │ n_round = 5 │ │ │ │ ProductManager = <class 'metagpt.roles.product_manager.ProductManager'> │ │ │ │ project_name = '' │ │ │ │ project_path = '' │ │ │ │ ProjectManager = <class 'metagpt.roles.project_manager.ProjectManager'> │ │ │ │ QaEngineer = <class 'metagpt.roles.qa_engineer.QaEngineer'> │ │ │ │ reqa_file = '' │ │ │ │ run_tests = False │ │ │ │ Team = <class 'metagpt.team.Team'> │ │ │ ╰──────────────────────────────────────────────────────────────────────────────────╯ │ │ │ │ /media/jeremy/2TB/MetaGPT/metagpt/roles/product_manager.py:43 in init │ │ │ │ 40 │ │ │ goal (str): Goal of the product manager. │ │ 41 │ │ │ constraints (str): Constraints or limitations for the product manager. 
│ │ 42 │ │ """ │ │ ❱ 43 │ │ super().init(name, profile, goal, constraints) │ │ 44 │ │ │ │ 45 │ │ self._init_actions([PrepareDocuments, WritePRD]) │ │ 46 │ │ self._watch([UserRequirement, PrepareDocuments]) │ │ │ │ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │ │ │ constraints = 'utilize the same language as the user requirements for seamless │ │ │ │ communication' │ │ │ │ goal = 'efficiently create a successful product that meets market demands and user │ │ │ │ expec'+7 │ │ │ │ name = 'Alice' │ │ │ │ profile = 'Product Manager' │ │ │ │ self = <metagpt.roles.product_manager.ProductManager object at 0x7f65a11b9cd0> │ │ │ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │ │ │ │ /media/jeremy/2TB/MetaGPT/metagpt/roles/role.py:133 in init │ │ │ │ 130 │ """Role/Agent""" │ │ 131 │ │ │ 132 │ def init(self, name="", profile="", goal="", constraints="", desc="", is_human=F │ │ ❱ 133 │ │ self._llm = LLM() if not is_human else HumanProvider() │ │ 134 │ │ self._setting = RoleSetting( │ │ 135 │ │ │ name=name, profile=profile, goal=goal, constraints=constraints, desc=desc, i │ │ 136 │ │ ) │ │ │ │ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │ │ │ constraints = 'utilize the same language as the user requirements for seamless │ │ │ │ communication' │ │ │ │ desc = '' │ │ │ │ goal = 'efficiently create a successful product that meets market demands and user │ │ │ │ expec'+7 │ │ │ │ is_human = False │ │ │ │ name = 'Alice' │ │ │ │ profile = 'Product Manager' │ │ │ │ self = <metagpt.roles.product_manager.ProductManager object at 0x7f65a11b9cd0> │ │ │ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │ │ │ │ /media/jeremy/2TB/MetaGPT/metagpt/llm.py:19 in LLM │ │ │ │ 16 │ │ 17 def LLM(provider: LLMProviderEnum = CONFIG.get_default_llm_provider_enum()) -> BaseGPTAP │ │ 18 │ """get the default llm provider""" │ │ ❱ 19 │ return LLM_REGISTRY.get_provider(provider) │ │ 20 │ │ │ │ ╭───────────────────── locals ──────────────────────╮ │ │ │ provider = <LLMProviderEnum.OPEN_LLM: 'open_llm'> │ │ │ ╰───────────────────────────────────────────────────╯ │ │ │ │ /media/jeremy/2TB/MetaGPT/metagpt/provider/llm_provider_registry.py:20 in get_provider │ │ │ │ 17 │ │ │ 18 │ def get_provider(self, enum: LLMProviderEnum): │ │ 19 │ │ """get provider instance according to the enum""" │ │ ❱ 20 │ │ return self.providers[enum]() │ │ 21 │ │ 22 │ │ 23 # Registry instance │ │ │ │ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │ │ │ enum = <LLMProviderEnum.OPEN_LLM: 'open_llm'> │ │ │ │ self = <metagpt.provider.llm_provider_registry.LLMProviderRegistry object at 0x7f658ac22a10> │ │ │ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │ ╰──────────────────────────────────────────────────────────────────────────────────────────────────╯ KeyError: <LLMProviderEnum.OPEN_LLM: 'open_llm'>