Open: BlizeGit opened this issue 3 weeks ago
The Spark API is throwing an error, which is most likely a configuration problem. Could you post the actual configuration you are using, with everything except api_key/api_secret?
The domain here must correspond one-to-one with your url. Your url points to Spark 3.5, so domain needs to be generalv3.5.
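For reference, a sketch of the url/domain pairing (the 3.5 pair comes from this thread; the other pairs follow iFlytek's documented versioning — verify against the Spark Web API docs before relying on them):

```yaml
llm:
  api_type: "spark"
  # The version segment in base_url must match domain:
  #   .../v1.1/chat -> general
  #   .../v2.1/chat -> generalv2
  #   .../v3.1/chat -> generalv3
  #   .../v3.5/chat -> generalv3.5
  base_url: "wss://spark-api.xf-yun.com/v3.5/chat"
  domain: "generalv3.5"
```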
"Your url is Spark 3.5, so domain needs to be generalv3.5": that is correct. I modified the config per this suggestion and also switched search from ddg to serpapi, and now there is a new problem.
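For the serpapi switch, the search section would presumably look something like the following. This is a sketch only: the engine field name mirrors the style of the config posted below, and the api_key field is an assumption — check config2.example.yaml for the exact field names your MetaGPT version expects.

```yaml
search:
  engine: "serpapi"
  api_key: "your-serpapi-key"  # hypothetical field; verify against config2.example.yaml
```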
After updating the configuration accordingly, the problem now is:
2024-06-07 15:55:09.709 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /root/MetaGPT
2024-06-07 15:55:13.695 | INFO | __main__:_act:56 - David(Researcher): to do CollectLinks(David)
2024-06-07 15:55:15.052 | WARNING | metagpt.utils.cost_manager:update_cost:49 - Model generalv3.5 not found in TOKEN_COSTS.
2024-06-07 15:55:16.386 | WARNING | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
Traceback (most recent call last):
File "/root/MetaGPT/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
File "/root/MetaGPT/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
File "/root/MetaGPT/metagpt/roles/researcher.py", line 105, in react
msg = await super().react()
File "/root/MetaGPT/metagpt/roles/role.py", line 519, in react
rsp = await self._react()
File "/root/MetaGPT/metagpt/roles/role.py", line 474, in _react
rsp = await self._act()
File "/root/MetaGPT/metagpt/roles/researcher.py", line 67, in _act
links = await todo.run(topic, 4, 4)
File "/root/MetaGPT/metagpt/actions/research.py", line 137, in run
prompt = reduce_message_length(gen_msg(), model_name, system_text, config.llm.max_token)
File "/root/MetaGPT/metagpt/utils/text.py", line 26, in reduce_message_length
max_token = TOKEN_MAX.get(model_name, 2048) - count_output_tokens(system_text, model_name) - reserved
File "/root/MetaGPT/metagpt/utils/token_counter.py", line 366, in count_output_tokens
encoding = tiktoken.encoding_for_model(model)
File "/usr/local/lib/python3.10/dist-packages/tiktoken/model.py", line 103, in encoding_for_model
return get_encoding(encoding_name_for_model(model_name))
File "/usr/local/lib/python3.10/dist-packages/tiktoken/model.py", line 86, in encoding_name_for_model
if model_name.startswith(model_prefix):
AttributeError: 'NoneType' object has no attribute 'startswith'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/root/MetaGPT/metagpt/roles/researcher.py", line 126, in <module>
["tensorflow", "pytorch"]
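The two warnings and the crash above share one root cause: a Spark model name (or None) is handed to helpers that only know OpenAI models. cost_manager cannot find generalv3.5 in TOKEN_COSTS, and tiktoken's encoding_for_model iterates known OpenAI model prefixes with model_name.startswith(...), which raises AttributeError when the name is None. A minimal sketch of the defensive fallback pattern (hypothetical helper name and an illustrative token table — not MetaGPT's actual code):

```python
# Illustrative subset of an OpenAI-centric token table.
TOKEN_MAX = {"gpt-4": 8192, "gpt-3.5-turbo": 4096}

def max_token_budget(model_name, default=2048):
    # Guard BEFORE any tiktoken call: a None or non-OpenAI model name
    # would otherwise reach model_name.startswith(...) and crash.
    if not model_name or model_name not in TOKEN_MAX:
        return default
    return TOKEN_MAX[model_name]

print(max_token_budget("generalv3.5"))  # 2048: unknown Spark model falls back
print(max_token_budget(None))           # 2048: avoids the NoneType crash
print(max_token_budget("gpt-4"))        # 8192
```

Note that in the traceback, reduce_message_length already falls back via TOKEN_MAX.get(model_name, 2048), but count_output_tokens then passes the same name into tiktoken anyway, which is where the AttributeError fires; any real fix has to guard before that call.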
I have now switched to openai, and the problem is as follows.
Bug description
platform
root@qingyi:~/MetaGPT/config# lsb_release -a
LSB Version:    core-11.1.0ubuntu4-noarch:security-11.1.0ubuntu4-noarch
Distributor ID: Ubuntu
Description:    Ubuntu 22.04.4 LTS
Release:        22.04
Codename:       jammy
root@qingyi:~/MetaGPT/config# python3 --version
Python 3.10.12
All components are installed successfully with the command lines below:
2.1 git clone https://github.com/geekan/MetaGPT.git
    cd /your/path/to/MetaGPT
    pip install -e .
2.2 pip install -e .[rag]
2.3 pip install playwright
    playwright install --with-deps chromium
    (but this package can't be found with the command pip list | grep chromium)
2.4 pip install metagpt[search-google]
    pip install metagpt[search-ddg]
    pip install metagpt[playwright]
    pip install metagpt[selenium]
configuration
root@qingyi:~/MetaGPT/config# cat config2.yaml
# Full Example: https://github.com/geekan/MetaGPT/blob/main/config/config2.example.yaml
# Reflected Code: https://github.com/geekan/MetaGPT/blob/main/metagpt/config2.py
# Config Docs: https://docs.deepwisdom.ai/main/en/guide/get_started/configuration.html
llm:
  api_type: "spark"
  # url of the corresponding model, see https://www.xfyun.cn/doc/spark/Web.html#_1-%E6%8E%A5%E5%8F%A3%E8%AF%B4%E6%98%8E
  base_url: "ws(s)://spark-api.xf-yun.com/v3.5/chat"
  app_id: "8888888"
  api_key: "8888888888"
  api_secret: "8888888888888"
  domain: "generalv2"  # one of [general, generalv2, generalv3, generalv3.5], must match the url
search:
  engine: "ddg"
browser:
  engine: "selenium"
  browser_type: "chrome"
Screenshots or logs
root@qingyi:~/MetaGPT# python3 -m metagpt.roles.researcher "tensorflow vs. pytorch"
2024-06-06 17:08:48.182 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /root/MetaGPT
2024-06-06 17:08:55.695 | INFO | __main__:_act:56 - David(Researcher): to do CollectLinks(David)
2024-06-06 17:08:55.859 | WARNING | tenacity.after:log_it:44 - Finished call to 'metagpt.provider.base_llm.BaseLLM.acompletion_text' after 0.164(s), this was the 1st time calling it.
2024-06-06 17:08:56.472 | WARNING | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
Traceback (most recent call last):
  File "/root/MetaGPT/metagpt/utils/common.py", line 640, in wrapper
    return await func(self, *args, **kwargs)
  File "/root/MetaGPT/metagpt/roles/role.py", line 550, in run
    rsp = await self.react()
  File "/root/MetaGPT/metagpt/roles/researcher.py", line 105, in react
    msg = await super().react()
  File "/root/MetaGPT/metagpt/roles/role.py", line 519, in react
    rsp = await self._react()
  File "/root/MetaGPT/metagpt/roles/role.py", line 474, in _react
    rsp = await self._act()
  File "/root/MetaGPT/metagpt/roles/researcher.py", line 67, in _act
    links = await todo.run(topic, 4, 4)
  File "/root/MetaGPT/metagpt/actions/research.py", line 113, in run
    keywords = await self._aask(SEARCH_TOPIC_PROMPT, [system_text])
  File "/root/MetaGPT/metagpt/actions/action.py", line 93, in _aask
    return await self.llm.aask(prompt, system_msgs)
  File "/root/MetaGPT/metagpt/provider/base_llm.py", line 152, in aask
    rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
  File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
  File "/root/MetaGPT/metagpt/provider/base_llm.py", line 202, in acompletion_text
    return await self._achat_completion_stream(messages, timeout=self.get_timeout(timeout))
  File "/root/MetaGPT/metagpt/provider/spark_api.py", line 76, in _achat_completion_stream
    async for chunk in response:
  File "/usr/local/lib/python3.10/dist-packages/sparkai/core/language_models/chat_models.py", line 309, in astream
    raise e
  File "/usr/local/lib/python3.10/dist-packages/sparkai/core/language_models/chat_models.py", line 301, in astream
    assert generation is not None
AssertionError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/root/MetaGPT/metagpt/roles/researcher.py", line 126, in <module>
fire.Fire(main)
File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 679, in _CallAndUpdateTrace
component = loop.run_until_complete(fn(*varargs, **kwargs))
File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/root/MetaGPT/metagpt/roles/researcher.py", line 124, in main
await role.run(topic)
File "/root/MetaGPT/metagpt/utils/common.py", line 662, in wrapper
raise Exception(format_trackback_info(limit=None))
Exception: Traceback (most recent call last):
File "/root/MetaGPT/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
File "/root/MetaGPT/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
File "/root/MetaGPT/metagpt/roles/researcher.py", line 105, in react
msg = await super().react()
File "/root/MetaGPT/metagpt/roles/role.py", line 519, in react
rsp = await self._react()
File "/root/MetaGPT/metagpt/roles/role.py", line 474, in _react
rsp = await self._act()
File "/root/MetaGPT/metagpt/roles/researcher.py", line 67, in _act
links = await todo.run(topic, 4, 4)
File "/root/MetaGPT/metagpt/actions/research.py", line 113, in run
keywords = await self._aask(SEARCH_TOPIC_PROMPT, [system_text])
File "/root/MetaGPT/metagpt/actions/action.py", line 93, in _aask
return await self.llm.aask(prompt, system_msgs)
File "/root/MetaGPT/metagpt/provider/base_llm.py", line 152, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/usr/local/lib/python3.10/dist-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/root/MetaGPT/metagpt/provider/base_llm.py", line 202, in acompletion_text
return await self._achat_completion_stream(messages, timeout=self.get_timeout(timeout))
File "/root/MetaGPT/metagpt/provider/spark_api.py", line 76, in _achat_completion_stream
async for chunk in response:
File "/usr/local/lib/python3.10/dist-packages/sparkai/core/language_models/chat_models.py", line 309, in astream
raise e
File "/usr/local/lib/python3.10/dist-packages/sparkai/core/language_models/chat_models.py", line 301, in astream
assert generation is not None
AssertionError