geekan / MetaGPT

🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
https://deepwisdom.ai/
MIT License

KeyError: 'Could not automatically map chatglm_turbo to a tokeniser. Please use `tiktoken.get_encoding` to explicitly get the tokeniser you expect.' #558

Open QIN2DIM opened 9 months ago

QIN2DIM commented 9 months ago

The following error occurs when running examples/research.py:

import asyncio

from metagpt.roles.researcher import RESEARCH_PATH, Researcher

async def main():
    topic = ("XXX")
    role = Researcher(language="zh-cn")
    await role.run(topic)

    print(f"save report to {RESEARCH_PATH / f'{topic}.md'}.")

if __name__ == '__main__':
    asyncio.run(main())
2023-12-14 15:27:28.767 | INFO     | metagpt.const:get_project_root:21 - PROJECT_ROOT set to D:\_GitHubProjects\Clones\MetaGPT
2023-12-14 15:27:28.848 | INFO     | metagpt.config:__init__:44 - Config loading done.
2023-12-14 15:27:30.221 | INFO     | metagpt.roles.researcher:_act:40 - David(Researcher): ready to CollectLinks
 ["artificial intelligence", "machine learning"]2023-12-14 15:27:31.683 | INFO     | metagpt.provider.openai_api:update_cost:91 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 36, completion_tokens: 11
Traceback (most recent call last):
  File "D:\_GitHubProjects\Clones\MetaGPT\examples\research.py", line 24, in <module>
    asyncio.run(main())
  File "D:\_firefly\Python310\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "D:\_firefly\Python310\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "D:\_GitHubProjects\Clones\MetaGPT\examples\research.py", line 18, in main
    await role.run(topic)
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\roles\role.py", line 330, in run
    rsp = await self.react()
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\roles\researcher.py", line 68, in react
    msg = await super().react()
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\roles\role.py", line 291, in react
    rsp = await self._act_by_order()
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\roles\role.py", line 278, in _act_by_order
    rsp = await self._act()
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\roles\researcher.py", line 51, in _act
    links = await todo.run(topic, 4, 4)
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\actions\research.py", line 130, in run
    prompt = reduce_message_length(gen_msg(), self.llm.model, system_text, CONFIG.max_tokens_rsp)
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\utils\text.py", line 21, in reduce_message_length
    max_token = TOKEN_MAX.get(model_name, 2048) - count_string_tokens(system_text, model_name) - reserved
  File "D:\_GitHubProjects\Clones\MetaGPT\metagpt\utils\token_counter.py", line 105, in count_string_tokens
    encoding = tiktoken.encoding_for_model(model_name)
  File "D:\_GitHubProjects\Clones\MetaGPT\.venv\lib\site-packages\tiktoken\model.py", line 70, in encoding_for_model
    raise KeyError(
KeyError: 'Could not automatically map chatglm_turbo to a tokeniser. Please use `tiktoken.get_encoding` to explicitly get the tokeniser you expect.'
shenchucheng commented 9 months ago

So far this has only been tested with OpenAI; other models need further adaptation.

redlion99 commented 9 months ago

Which encoding does chatglm_turbo use?

HuntZhaozq commented 8 months ago

@shenchucheng Is the Researcher still usable only with OpenAI at the moment?