geekan / MetaGPT

🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
https://deepwisdom.ai/

AttributeError: 'NoneType' object has no attribute 'startswith' when running "Researcher" example with Gemini #1023

Open vmsysadm opened 3 months ago

vmsysadm commented 3 months ago

Bug description

Running the example provided on the website - https://docs.deepwisdom.ai/main/en/guide/use_cases/agent/researcher.html - fails when using the "gemini" model:

Environment information

System version: Windows 11
Python version: Python 3.10.9
LLM type and model: Gemini

Installation method: "development mode"

git clone https://github.com/geekan/MetaGPT.git

config2.example.yaml:

llm:
  api_type: 'gemini'
  api_key: 'XXXXXXXXXXXXXXXXXXXXXXXXXXX'

search:
  api_type: "google"
  api_key: "XXXXXXXXXXXXXXXXXXXXXXXXXXX"
  cse_id: "XXXXXXXXXXXXXX"

(metagpt) PS C:\Users\user\code\test\MetaGPT> python -m metagpt.roles.researcher "tensorflow vs. pytorch"
2024-03-17 18:28:13.795 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to C:\Users\user\code\test\MetaGPT
2024-03-17 18:28:23.681 | INFO | __main__:_act:56 - David(Researcher): to do CollectLinks(David) ["Machine learning", "Natural language processing"]
2024-03-17 18:28:24.807 | INFO | metagpt.utils.cost_manager:update_cost:57 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 37, completion_tokens: 10
2024-03-17 18:28:25.233 | WARNING | metagpt.utils.common:wrapper:647 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
Traceback (most recent call last):
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\common.py", line 638, in wrapper
    return await func(self, *args, **kwargs)
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 548, in run
    rsp = await self.react()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 105, in react
    msg = await super().react()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 517, in react
    rsp = await self._act_by_order()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 471, in _act_by_order
    rsp = await self._act()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 67, in _act
    links = await todo.run(topic, 4, 4)
  File "C:\Users\user\code\test\MetaGPT\metagpt\actions\research.py", line 137, in run
    prompt = reduce_message_length(gen_msg(), model_name, system_text, config.llm.max_token)
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\text.py", line 26, in reduce_message_length
    max_token = TOKEN_MAX.get(model_name, 2048) - count_string_tokens(system_text, model_name) - reserved
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\token_counter.py", line 262, in count_string_tokens
    encoding = tiktoken.encoding_for_model(model_name)
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\tiktoken\model.py", line 97, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\tiktoken\model.py", line 80, in encoding_name_for_model
    if model_name.startswith(model_prefix):
AttributeError: 'NoneType' object has no attribute 'startswith'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 126, in <module>
    fire.Fire(main)
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\fire\core.py", line 466, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\fire\core.py", line 679, in _CallAndUpdateTrace
    component = loop.run_until_complete(fn(*varargs, **kwargs))
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 124, in main
    await role.run(topic)
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\common.py", line 660, in wrapper
    raise Exception(format_trackback_info(limit=None))
Exception: Traceback (most recent call last):
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\common.py", line 638, in wrapper
    return await func(self, *args, **kwargs)
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 548, in run
    rsp = await self.react()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 105, in react
    msg = await super().react()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 517, in react
    rsp = await self._act_by_order()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\role.py", line 471, in _act_by_order
    rsp = await self._act()
  File "C:\Users\user\code\test\MetaGPT\metagpt\roles\researcher.py", line 67, in _act
    links = await todo.run(topic, 4, 4)
  File "C:\Users\user\code\test\MetaGPT\metagpt\actions\research.py", line 137, in run
    prompt = reduce_message_length(gen_msg(), model_name, system_text, config.llm.max_token)
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\text.py", line 26, in reduce_message_length
    max_token = TOKEN_MAX.get(model_name, 2048) - count_string_tokens(system_text, model_name) - reserved
  File "C:\Users\user\code\test\MetaGPT\metagpt\utils\token_counter.py", line 262, in count_string_tokens
    encoding = tiktoken.encoding_for_model(model_name)
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\tiktoken\model.py", line 97, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
  File "C:\Users\user\scoop\apps\miniconda3\current\envs\metagpt\lib\site-packages\tiktoken\model.py", line 80, in encoding_name_for_model
    if model_name.startswith(model_prefix):
AttributeError: 'NoneType' object has no attribute 'startswith'

iorisa commented 3 months ago

This log indicates that model_name is None:

if model_name.startswith(model_prefix):
AttributeError: 'NoneType' object has no attribute 'startswith'

According to the source code:

        model_name = config.llm.model
        prompt = reduce_message_length(gen_msg(), model_name, system_text, config.llm.max_token)

The model_name should correspond to llm.model in the configuration:

llm:
  ...
  model: "gemini-pro"  # or gpt-3.5-turbo-1106 / gpt-4-1106-preview
  ...
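
To make the failure mode concrete, here is a minimal sketch, assuming only the public tiktoken API and a plain dict standing in for MetaGPT's llm config (the dict and variable names are illustrative, not MetaGPT's actual objects). With no model entry, model_name ends up as None, and tiktoken's prefix lookup raises exactly the error in the traceback above:

    import tiktoken

    # Illustrative stand-in for a config2.yaml llm section with no "model" key.
    llm_config = {"api_type": "gemini", "api_key": "XXX"}
    model_name = llm_config.get("model")  # -> None

    try:
        # The same call that count_string_tokens makes in token_counter.py.
        tiktoken.encoding_for_model(model_name)
    except AttributeError as err:
        # tiktoken iterates its model-prefix table and calls
        # model_name.startswith(prefix), which fails on None.
        print(f"AttributeError: {err}")

Setting llm.model (for example to "gemini-pro") gives model_name a real string value, so the None dereference disappears.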
vmsysadm commented 3 months ago

Thank you for the information; that works! You may want to update your "Configuration" page https://docs.deepwisdom.ai/main/en/guide/get_started/configuration.html to mention that the Gemini model must be specified explicitly:

Google Gemini

Supports default model gemini-pro.

llm:
  api_type: 'gemini'
  api_key: 'YOUR_API_KEY'
  model: 'MODEL_NAME'  # gemini-pro
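
For completeness, a config2.yaml that combines the settings from the original report with this fix might look as follows (placeholder values; the search block simply mirrors the one in the bug report):

llm:
  api_type: 'gemini'
  api_key: 'YOUR_API_KEY'
  model: 'gemini-pro'

search:
  api_type: 'google'
  api_key: 'YOUR_API_KEY'
  cse_id: 'YOUR_CSE_ID'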

Unfortunately, the "Researcher" example still fails to complete correctly when using Gemini; I will open a separate issue for that.

geekan commented 3 months ago

Added to the documentation. It will show up when the docs are redeployed.