infiniflow / ragflow

RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
https://ragflow.io
Apache License 2.0

[Bug] Using gemini-1.5-flash-lastest as Chat model throws error with knowledge graph creation #2720

Open marcfon opened 2 weeks ago

marcfon commented 2 weeks ago

Is there an existing issue for the same bug?

Branch name

main

Commit ID

--

Other environment information

No response

Actual behavior

```
Traceback (most recent call last):
  File "/ragflow/graphrag/graph_extractor.py", line 128, in __call__
    result, token_count = self._process_document(text, prompt_variables)
  File "/ragflow/graphrag/graph_extractor.py", line 177, in _process_document
    if response.find("**ERROR**") >=0: raise Exception(response)
Exception: **ERROR**: 400 Please use a valid role: user, model.
**ERROR**: contents must not be empty
```
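Both error strings come from the Gemini API: it only accepts the roles `user` and `model` (not `system` or `assistant`), and it rejects a request whose `contents` list is empty. A minimal sketch of the kind of role translation a client has to do before calling Gemini (hypothetical helper for illustration, not RAGFlow's actual code):

```python
def to_gemini_contents(openai_messages):
    """Map OpenAI-style chat messages to Gemini's content format.

    Gemini only accepts the roles "user" and "model"; "assistant" must be
    renamed, "system" folded into a user turn, and empty turns dropped,
    otherwise the API returns 400 errors like the ones in the traceback.
    """
    role_map = {"user": "user", "assistant": "model", "system": "user"}
    contents = []
    for msg in openai_messages:
        text = (msg.get("content") or "").strip()
        if not text:
            continue  # skip empty turns: "contents must not be empty"
        contents.append({
            "role": role_map.get(msg["role"], "user"),
            "parts": [{"text": text}],
        })
    return contents
```

If the knowledge-graph extractor sends a `system` prompt or an empty message straight through to Gemini, both reported errors would be expected, while OpenAI models such as gpt-4o-mini accept the same payload.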

Expected behavior

No response

Steps to reproduce

1. Set `Model Providers > System Model Settings > Chat model` to `gemini-1.5-flash-lastest`
2. Create a knowledge graph 
3. Process a file

Additional information

Processing the file works fine when the Chat model is set to gpt-4o-mini.

KevinHuSh commented 2 weeks ago

Which LLM factory/supplier did you select? Gemini?

marcfon commented 2 weeks ago

> Which LLM factory/supplier did you select? Gemini?

Yes, Gemini.