geekan / MetaGPT

🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
https://deepwisdom.ai/
MIT License
43.58k stars 5.19k forks

unsupported operand type(s) for +: 'generator' and 'list' when running researcher case #737

Open tappat225 opened 8 months ago

tappat225 commented 8 months ago

Bug description

TypeError: unsupported operand type(s) for +: 'generator' and 'list' occurred when I tried to run the researcher case. I thought something was wrong with my environment, but I got the same error when I ran it again inside a Docker container.

Environment information

Snippets from the key.yaml

OPENAI_API_MODEL: "gpt-3.5-turbo-16k"
MAX_TOKENS: 4096
RPM: 10
TIMEOUT: 60

SEARCH_ENGINE: serpapi
WEB_BROWSER_ENGINE: playwright

### for Research
MODEL_FOR_RESEARCHER_SUMMARY: gpt-3.5-turbo
MODEL_FOR_RESEARCHER_REPORT: gpt-3.5-turbo-16k

logs

Traceback (most recent call last):
  File "/app/metagpt/metagpt/utils/common.py", line 497, in wrapper
    return await func(self, *args, **kwargs)
  File "/app/metagpt/metagpt/roles/role.py", line 482, in run
    rsp = await self.react()
  File "/app/metagpt/metagpt/roles/researcher.py", line 105, in react
    msg = await super().react()
  File "/app/metagpt/metagpt/roles/role.py", line 452, in react
    rsp = await self._act_by_order()
  File "/app/metagpt/metagpt/roles/role.py", line 439, in _act_by_order
    rsp = await self._act()
  File "/app/metagpt/metagpt/roles/researcher.py", line 74, in _act
    summaries = await asyncio.gather(*todos)
  File "/app/metagpt/metagpt/actions/research.py", line 223, in run
    for prompt in generate_prompt_chunk(
  File "/app/metagpt/metagpt/utils/text.py", line 68, in generate_prompt_chunk
    paragraphs = split_paragraph(paragraph) + paragraphs
TypeError: unsupported operand type(s) for +: 'generator' and 'list'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/app/metagpt/metagpt/roles/researcher.py", line 126, in <module>
    fire.Fire(main)
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 466, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 679, in _CallAndUpdateTrace
    component = loop.run_until_complete(fn(*varargs, **kwargs))
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/app/metagpt/metagpt/roles/researcher.py", line 124, in main
    await role.run(topic)
  File "/app/metagpt/metagpt/utils/common.py", line 513, in wrapper
    raise Exception(format_trackback_info(limit=None))
Exception: Traceback (most recent call last):
  File "/app/metagpt/metagpt/utils/common.py", line 497, in wrapper
    return await func(self, *args, **kwargs)
  File "/app/metagpt/metagpt/roles/role.py", line 482, in run
    rsp = await self.react()
  File "/app/metagpt/metagpt/roles/researcher.py", line 105, in react
    msg = await super().react()
  File "/app/metagpt/metagpt/roles/role.py", line 452, in react
    rsp = await self._act_by_order()
  File "/app/metagpt/metagpt/roles/role.py", line 439, in _act_by_order
    rsp = await self._act()
  File "/app/metagpt/metagpt/roles/researcher.py", line 74, in _act
    summaries = await asyncio.gather(*todos)
  File "/app/metagpt/metagpt/actions/research.py", line 223, in run
    for prompt in generate_prompt_chunk(
  File "/app/metagpt/metagpt/utils/text.py", line 68, in generate_prompt_chunk
    paragraphs = split_paragraph(paragraph) + paragraphs
TypeError: unsupported operand type(s) for +: 'generator' and 'list'
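The error itself is easy to reproduce outside MetaGPT: in Python, a generator cannot be concatenated to a list with +. A minimal sketch of the failing line in generate_prompt_chunk (split_paragraph here is a hypothetical stand-in, not the real implementation in metagpt/utils/text.py):

```python
def split_paragraph(paragraph):
    # Returns a generator (e.g. a generator expression), not a list
    return (part for part in paragraph.split(". "))

paragraphs = ["existing paragraph"]

try:
    # This is the shape of the failing line: generator + list
    paragraphs = split_paragraph("a. b") + paragraphs
except TypeError as e:
    print(e)  # unsupported operand type(s) for +: 'generator' and 'list'

# Materializing the generator first makes the concatenation valid
paragraphs = list(split_paragraph("a. b")) + ["existing paragraph"]
print(paragraphs)  # ['a', 'b', 'existing paragraph']
```

Note that the code path returning a generator is only reached under the conditions described in the next comment; that is why the error did not show up in every run.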

["dataiku", "datarobot"]
["dataiku vs datarobot", "comparison between dataiku and datarobot", "features of dataiku and datarobot", "pros and cons of dataiku and datarobot"]
[1, 2, 3, 0, 4]
[1, 2, 3, 0, 4]
[0, 1, 2, 4, 5, 6]
[0, 2, 3, 5, 6]
tappat225 commented 8 months ago

Solved. It turns out that there is no value check on max_token, which is calculated by the code max_token = TOKEN_MAX.get(model_name, 2048) - reserved - 100. Since I set MAX_TOKENS: 4096 in key.yaml, the calculation result ends up less than 0.

Solution: set MAX_TOKENS in key.yaml to less than your model's maximum token limit. In this case, I reset MAX_TOKENS to 1024, which solved my issue.
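The arithmetic in this comment can be sketched as follows (the TOKEN_MAX table and the role of reserved are assumptions inferred from the quoted line, not MetaGPT's actual code):

```python
# Hypothetical token-limit table; 2048 is the default from the quoted line
TOKEN_MAX = {"gpt-3.5-turbo": 4096, "gpt-3.5-turbo-16k": 16384}

def prompt_budget(model_name, reserved):
    # `reserved` is driven by MAX_TOKENS from key.yaml
    return TOKEN_MAX.get(model_name, 2048) - reserved - 100

# MAX_TOKENS: 4096 against the 4096-token summary model leaves a negative budget:
print(prompt_budget("gpt-3.5-turbo", 4096))  # -100
# Lowering MAX_TOKENS to 1024 restores a usable prompt budget:
print(prompt_budget("gpt-3.5-turbo", 1024))  # 2972
```

With a negative budget, the chunking code presumably takes an unexpected path that surfaces as the generator/list TypeError above, rather than reporting the misconfiguration directly.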

Maybe there should be a clear warning for this situation, or the parameter should be adjusted automatically.
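One illustrative shape such a guard could take (a sketch only, with hypothetical names; this is not the fix that eventually landed in the repository):

```python
import warnings

def checked_budget(model_name, reserved, token_max=None):
    """Compute the prompt budget, warning and clamping if it goes non-positive."""
    limit = (token_max or {}).get(model_name, 2048)
    budget = limit - reserved - 100
    if budget <= 0:
        warnings.warn(
            f"MAX_TOKENS ({reserved}) leaves no room for the prompt on "
            f"{model_name} (limit {limit}); clamping to half the model limit."
        )
        budget = limit // 2  # alternatively: raise, or auto-shrink `reserved`
    return budget
```

A raise-on-misconfiguration variant would surface the root cause even earlier, at the cost of requiring the user to edit key.yaml before anything runs.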

geekan commented 5 months ago

@shenchucheng Can you confirm if this issue has been resolved?

shenchucheng commented 5 months ago

> @shenchucheng Can you confirm if this issue has been resolved?

@geekan Fixed by https://github.com/geekan/MetaGPT/pull/867.