gzmud opened 2 weeks ago
If you can chat with the LLM deployed by Ollama, then this comes down to the NLP capability of the LLM itself. Switch to a larger model.
Is there a way to record the complete communication between RAGFlow and Ollama for debugging purposes?
Is there an existing issue for the same bug?
Branch name
dev
Commit ID
dev (image ID: 6ee854751c7e)
Other environment information
Actual behavior
In practice, GraphRAG failed to extract relations correctly; the output did not match the expected result.
Expected behavior
When using the GraphRAG feature, relations should be correctly extracted from the input text.
Steps to reproduce
Additional information
That's all.