geekan / MetaGPT

🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
https://deepwisdom.ai/
MIT License

ValueError: Creator not registered for key: LLMType.OLLAMA #1382

Open omige opened 3 months ago

omige commented 3 months ago

System version: Windows 10
Python version: 3.9.6
MetaGPT version or branch: 0.8

Bug description

config2.yaml:

```yaml
llm:
  api_type: 'ollama'
  base_url: 'http://192.168.0.70:11434/api'
  model: 'qwen2:1.5b'
  max_token: 2048

repair_llm_output: true

embedding:
  api_type: 'ollama'
  base_url: 'http://192.168.0.70:11434/api'
  model: 'qwen2:1.5b'
```
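
Note that in the v0.8.x releases the RAG embedding factory keys on `config.llm.api_type` rather than on the `embedding` section (see the traceback below), so an `ollama` llm config reaches the factory as `LLMType.OLLAMA`. A small diagnostic sketch, assuming the `metagpt.config2.config` object referenced in that traceback; the `embedding` attribute exists only on newer builds:

```python
# Hedged diagnostic sketch: print the key the RAG embedding factory will use.
# Assumes the metagpt.config2.config singleton seen in the traceback below;
# config.embedding is only present on newer (main-branch) versions.
from metagpt.config2 import config

print(config.llm.api_type)                 # v0.8.x keys the embedding factory on this value
print(getattr(config, "embedding", None))  # main branch reads this section instead, when present
```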

Jupyter notebook code:

```python
import asyncio

from metagpt.rag.engines import SimpleEngine
from metagpt.const import EXAMPLE_DATA_PATH

DOC_PATH = EXAMPLE_DATA_PATH / "quanwen.txt"


async def main():
    engine = SimpleEngine.from_docs(input_files=[DOC_PATH])
    answer = await engine.aquery("自动杂散测试系统包括哪些模块?")
    print(answer)


await main()
```

```
ValueError                                Traceback (most recent call last)
Cell In[5], line 1
----> 1 await main()

Cell In[4], line 9, in main()
      8 async def main():
----> 9     engine = SimpleEngine.from_docs(input_files=[DOC_PATH])
     10     answer = await engine.aquery("自动杂散测试系统包括哪些模块?")
     11     print(answer)

File ~\AppData\Roaming\Python\Python39\site-packages\metagpt\rag\engines\simple.py:109, in SimpleEngine.from_docs(cls, input_dir, input_files, transformations, embed_model, llm, retriever_configs, ranker_configs)
    103 documents = SimpleDirectoryReader(input_dir=input_dir, input_files=input_files).load_data()
    104 cls._fix_document_metadata(documents)
    106 index = VectorStoreIndex.from_documents(
    107     documents=documents,
    108     transformations=transformations or [SentenceSplitter()],
--> 109     embed_model=cls._resolve_embed_model(embed_model, retriever_configs),
    110 )
    111 return cls._from_index(index, llm=llm, retriever_configs=retriever_configs, ranker_configs=ranker_configs)

File ~\AppData\Roaming\Python\Python39\site-packages\metagpt\rag\engines\simple.py:261, in SimpleEngine._resolve_embed_model(embed_model, configs)
    258 if configs and all(isinstance(c, NoEmbedding) for c in configs):
    259     return MockEmbedding(embed_dim=1)
--> 261 return embed_model or get_rag_embedding()

File ~\AppData\Roaming\Python\Python39\site-packages\metagpt\rag\factories\embedding.py:24, in RAGEmbeddingFactory.get_rag_embedding(self, key)
     22 def get_rag_embedding(self, key: LLMType = None) -> BaseEmbedding:
     23     """Key is LLMType, default use config.llm.api_type."""
---> 24     return super().get_instance(key or config.llm.api_type)

File ~\AppData\Roaming\Python\Python39\site-packages\metagpt\rag\factories\base.py:29, in GenericFactory.get_instance(self, key, **kwargs)
     26 if creator:
     27     return creator(**kwargs)
---> 29 raise ValueError(f"Creator not registered for key: {key}")

ValueError: Creator not registered for key: LLMType.OLLAMA
```
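
The failure boils down to a registry lookup: a dictionary of creator callables keyed by type, where an unregistered key raises exactly this ValueError. A minimal, hypothetical sketch of that dispatch pattern (simplified names, not MetaGPT's actual code):

```python
# Minimal sketch of the dispatch that fails above (hypothetical and simplified,
# not MetaGPT's actual implementation): creators are registered per key, and
# looking up an unregistered key raises the ValueError seen in the traceback.
from enum import Enum


class LLMType(Enum):
    OPENAI = "openai"
    OLLAMA = "ollama"


class GenericFactory:
    def __init__(self, creators: dict):
        self._creators = creators

    def get_instance(self, key, **kwargs):
        creator = self._creators.get(key)
        if creator:
            return creator(**kwargs)
        raise ValueError(f"Creator not registered for key: {key}")


factory = GenericFactory({LLMType.OPENAI: lambda **_: "openai-embedding"})
factory.get_instance(LLMType.OLLAMA)  # ValueError: Creator not registered for key: LLMType.OLLAMA
```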

shenchucheng commented 3 months ago

Please use the latest version of MetaGPT and try again.

omige commented 2 months ago

I have already tried the latest version, v0.8.1, but I still get the same error: `ValueError: Creator not registered for key: LLMType.OLLAMA`.

hulei2018 commented 2 months ago

I have the same issue.

shenchucheng commented 2 months ago

Please try `pip install --upgrade git+https://github.com/geekan/MetaGPT.git`.

hulei2018 commented 2 months ago

I have tried that, but I still have the same issue.

shenchucheng commented 2 months ago

Could you please upload your error logs?

hulei2018 commented 2 months ago

Code:

```python
import asyncio

from metagpt.rag.engines import SimpleEngine

doc_path = "data_base/travel.txt"


async def main():
    engine = SimpleEngine.from_docs(input_files=[doc_path])
    answer = await engine.query("what does Bob like?")
    print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```

Error:

```
2024-07-19 16:47:39.865 | INFO     | metagpt.const:get_metagpt_package_root:29 - Package root set to D:\py_workspace\gpt_agent
Traceback (most recent call last):
  File "D:\py_workspace\gpt_agent\knowledge_rags.py", line 15, in <module>
    asyncio.run(main())
  File "D:\anaconda_software\envs\agent\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "D:\anaconda_software\envs\agent\lib\asyncio\base_events.py", line 642, in run_until_complete
    return future.result()
  File "D:\py_workspace\gpt_agent\knowledge_rags.py", line 9, in main
    engine = SimpleEngine.from_docs(input_files=[doc_path])
  File "D:\anaconda_software\envs\agent\lib\site-packages\metagpt\rag\engines\simple.py", line 109, in from_docs
    embed_model=cls._resolve_embed_model(embed_model, retriever_configs),
  File "D:\anaconda_software\envs\agent\lib\site-packages\metagpt\rag\engines\simple.py", line 261, in _resolve_embed_model
    return embed_model or get_rag_embedding()
  File "D:\anaconda_software\envs\agent\lib\site-packages\metagpt\rag\factories\embedding.py", line 24, in get_rag_embedding
    return super().get_instance(key or config.llm.api_type)
  File "D:\anaconda_software\envs\agent\lib\site-packages\metagpt\rag\factories\base.py", line 29, in get_instance
    raise ValueError(f"Creator not registered for key: {key}")
ValueError: Creator not registered for key: LLMType.ZHIPUAI
```

shenchucheng commented 2 months ago

@hulei2018 So far, only OPENAI, AZURE, GEMINI, and OLLAMA are supported. ZHIPUAI support is not available yet.

See https://github.com/geekan/MetaGPT/blob/main/metagpt/rag/factories/embedding.py#L22

class RAGEmbeddingFactory(GenericFactory):
    """Create LlamaIndex Embedding with MetaGPT's embedding config."""

    def __init__(self):
        creators = {
            EmbeddingType.OPENAI: self._create_openai,
            EmbeddingType.AZURE: self._create_azure,
            EmbeddingType.GEMINI: self._create_gemini,
            EmbeddingType.OLLAMA: self._create_ollama,
            # For backward compatibility
            LLMType.OPENAI: self._create_openai,
            LLMType.AZURE: self._create_azure,
        }
        super().__init__(creators)
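
Since `SimpleEngine.from_docs` accepts an explicit `embed_model` (see its signature in the traceback above), one possible workaround, not suggested in this thread and offered only as a sketch, is to build a LlamaIndex embedding yourself and bypass the factory lookup entirely. This assumes the `llama-index-embeddings-ollama` package is installed and an Ollama server is reachable at the given `base_url`:

```python
# Hedged workaround sketch: pass an explicit embed_model so that
# RAGEmbeddingFactory.get_rag_embedding() is never consulted.
# Assumes llama-index-embeddings-ollama is installed and Ollama is running.
from llama_index.embeddings.ollama import OllamaEmbedding

from metagpt.rag.engines import SimpleEngine

embed_model = OllamaEmbedding(
    model_name="qwen2:1.5b",
    base_url="http://192.168.0.70:11434",
)

engine = SimpleEngine.from_docs(
    input_files=["data_base/travel.txt"],
    embed_model=embed_model,  # skips the creator registry shown above
)
```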

yuzebo-q commented 3 weeks ago

I tried Gemini, but it still didn't work. The error shows `ValueError: Creator not registered for key: LLMType.GEMINI`. Here is my config:

```yaml
llm:
  api_type: "gemini"
  api_key: "my_key"
  dimensions: "32768"  # output dimension of embedding model
```

Additionally, I'm using an example with RAG, and if I follow the tutorial, it should include another config for the embedding, like this:

```yaml
embedding:
  api_type: "gemini"
  api_key: "my_key"
  dimensions: "32768"  # output dimension of embedding model
```

However, it still uses the llm config.

seehi commented 3 weeks ago

> I tried Gemini, but it still didn't work. The error shows `ValueError: Creator not registered for key: LLMType.GEMINI`. Here is my config:
>
> ```yaml
> llm:
>   api_type: "gemini"
>   api_key: "my_key"
>   dimensions: "32768"  # output dimension of embedding model
> ```
>
> Additionally, I'm using an example with RAG, and if I follow the tutorial, it should include another config for the embedding, like this:
>
> ```yaml
> embedding:
>   api_type: "gemini"
>   api_key: "my_key"
>   dimensions: "32768"  # output dimension of embedding model
> ```
>
> However, it still uses the llm config.

You need to use the main branch of MetaGPT: https://docs.deepwisdom.ai/main/en/guide/in_depth_guides/rag_module.html

yuzebo-q commented 3 weeks ago

I installed it according to the documentation above; here is my version:

[Screenshot 2024-09-10 124142: the installed MetaGPT version]

seehi commented 3 weeks ago

Your HEAD is at v0.8.1, not the main branch.
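
A hedged way to double-check what pip actually installed: `importlib.metadata` reports the installed distribution's version string, though a git install of the main branch may still report the same number as the last release, so treat the output as a hint only:

```python
# Hedged diagnostic: report the installed MetaGPT distribution version.
# Assumes the package was installed under the distribution name "metagpt";
# a git+main install may or may not expose a distinct version string.
from importlib.metadata import version

print(version("metagpt"))
```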

yuzebo-q commented 3 weeks ago

"Alright, thank you."