severian42 / GraphRAG-Local-UI

GraphRAG using Local LLMs - Features robust API and multiple apps for Indexing/Prompt Tuning/Query/Chat/Visualizing/Etc. This is meant to be the ultimate GraphRAG/KG local LLM app.
MIT License
1.75k stars · 207 forks

Local search works but global search does not, and running index_app.py reports AttributeError: no attribute 'rstrip' #80

Open goodmaney opened 3 months ago

goodmaney commented 3 months ago

app.py reports: Error parsing query response: Expecting value: line 1 column 1 (char 0).

Then I tried running a global search from the command line, and it reports: Error parsing search response json. ... json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0).
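For context, this exact error is what Python's json module raises when asked to parse an empty string or plain prose, which is what a local LLM often returns when it ignores the JSON-output instruction. A minimal reproduction:

```python
import json

# Parsing an empty string (or any non-JSON text, e.g. a plain-prose LLM
# reply) raises exactly the error quoted in the report:
try:
    json.loads("")
except json.JSONDecodeError as err:
    print(err)  # Expecting value: line 1 column 1 (char 0)
```

So the traceback usually means the model's response body was empty or not JSON at all, rather than a bug in the JSON parser itself.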

And

Running index_app.py reports an error: no attribute 'rstrip'. I edited the .env file; app.py runs, but index_app.py reports: AttributeError: 'NoneType' object has no attribute 'rstrip'. I then changed LLM_API_BASE and EMBEDDINGS_API_BASE directly in the code, but that did not work.

I use xinference, not ollama. The LLM is llama3.1 8B.
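The 'NoneType' rstrip error is the typical symptom of an environment variable that was never read: os.getenv returns None when the variable is absent, and calling .rstrip on it crashes. A hedged sketch of the failure mode and a defensive alternative (get_api_base is a hypothetical helper, not a function from the GraphRAG-Local-UI codebase):

```python
import os

def get_api_base(var: str = "LLM_API_BASE") -> str:
    """Read an API base URL from the environment, failing with a clear
    message instead of the opaque 'NoneType' rstrip AttributeError."""
    value = os.getenv(var)
    if value is None:  # variable missing from .env / not exported
        raise RuntimeError(f"{var} is not set; check your .env file")
    return value.rstrip("/")

# Reproducing the reported crash: None has no string methods.
try:
    None.rstrip("/")
except AttributeError as err:
    print(err)  # 'NoneType' object has no attribute 'rstrip'
```

If the error persists after editing .env, it is worth checking that the .env file is actually being loaded by index_app.py (e.g. that it sits in the working directory the app is launched from), since hard-coding the values in one module will not help if a different module reads the environment.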

shaoqing404 commented 3 months ago

You can find the part that initializes the global search engine and change json_mode to False:

```python
global_search_engine = GlobalSearch(
    llm=llm,
    context_builder=global_context_builder,
    token_encoder=token_encoder,
    max_data_tokens=12000,
    map_llm_params=map_llm_params,
    reduce_llm_params=reduce_llm_params,
    allow_general_knowledge=False,
    json_mode=True,  # change this to False
    context_builder_params=global_context_builder_params,
    concurrent_coroutines=32,
    response_type="multiple paragraphs",
)
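Turning json_mode off helps because many local models cannot reliably emit strict JSON, and a single stray token makes json.loads throw the "Expecting value" error. With json_mode off, the caller has to tolerate plain-text replies; a minimal defensive pattern (parse_model_output is a hypothetical helper, not part of GraphRAG):

```python
import json

def parse_model_output(text: str) -> dict:
    """Try to parse the model reply as JSON; fall back to wrapping the
    raw text instead of crashing on non-JSON output."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return {"response": text}

print(parse_model_output('{"points": []}'))    # {'points': []}
print(parse_model_output("Plain text answer"))  # {'response': 'Plain text answer'}
```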

Tovi163 commented 3 months ago


@goodmaney try a "direct query" first, to check whether graphrag itself works with xinference.