severian42 / GraphRAG-Local-UI

GraphRAG using Local LLMs - Features robust API and multiple apps for Indexing/Prompt Tuning/Query/Chat/Visualizing/Etc. This is meant to be the ultimate GraphRAG/KG local LLM app.
MIT License

Indexing fails at `create_final_community_reports`: "❌ Errors occurred during the pipeline run, see logs for more details." #64

Open zxk-master opened 4 months ago

zxk-master commented 4 months ago

In settings.yaml I have `model_supports_json: false`.

Running on Windows 11 with a local llama3.1:latest model.

Error log message:

```
File "\GraphRAG-Local-UI\graphrag\graphrag\llm\base\base_llm.py", line 65, in _invoke_json
    raise NotImplementedError(msg)
NotImplementedError: JSON output not supported by this LLM
```

The structured log entry ends with `"source": "JSON output not supported by this LLM", "details": null`.
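To make the failure mode concrete, here is a minimal self-contained sketch of the behavior the traceback describes: the base LLM class raises `NotImplementedError` from `_invoke_json` when the configured model is marked as not supporting JSON. The names `BaseLLM`, `_invoke_json`, and `model_supports_json` come from the traceback and the config above; everything else is scaffolding for illustration, not GraphRAG's actual implementation.

```python
# Hypothetical sketch of the failure path from the traceback. The community
# report step needs structured JSON output, so when the model is flagged as
# not supporting JSON, the JSON invocation path raises instead of running.

class BaseLLM:
    def __init__(self, model_supports_json: bool):
        self.model_supports_json = model_supports_json

    def _invoke_json(self, prompt: str) -> str:
        if not self.model_supports_json:
            # Mirrors: NotImplementedError: JSON output not supported by this LLM
            raise NotImplementedError("JSON output not supported by this LLM")
        return '{"report": "..."}'  # placeholder structured output


llm = BaseLLM(model_supports_json=False)
try:
    llm._invoke_json("summarize community 0")
except NotImplementedError as exc:
    print(exc)  # JSON output not supported by this LLM
```

This illustrates why flipping `model_supports_json` to `false` does not skip the step: `create_final_community_reports` still goes through the JSON invocation path, so the flag turns the run into a hard error rather than a non-JSON fallback.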

severian42 commented 4 months ago

Hey! I'm not 100% sure why disabling JSON mode still causes the run to fail. I haven't tried running an Index without JSON yet, so I'll need to experiment and see what might be causing it.
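Since the community-report step appears to require JSON output either way, one thing that may be worth trying (an untested suggestion, not a confirmed fix) is leaving `model_supports_json` enabled, provided the local endpoint serving llama3.1 can actually return JSON. The field names below follow GraphRAG's settings.yaml schema; the endpoint and model values are placeholders for this setup:

```yaml
# settings.yaml (sketch; endpoint values are placeholders, not verified)
llm:
  type: openai_chat
  model: llama3.1:latest
  api_base: http://localhost:11434/v1   # e.g. an OpenAI-compatible local endpoint
  model_supports_json: true             # community-report step requires JSON output
```

If the endpoint cannot produce valid JSON, this would just move the failure from `_invoke_json` to a parse error, so it only helps when the serving layer supports structured output.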

fengtf commented 3 months ago

I have the same problem

Sendarg commented 2 months ago

I have the same problem