Closed ayanami108 closed 4 months ago
same issue
The issue is likely due to the insufficient context window of your local LLM instance, causing it to be unable to read the entire prompt. If you are using Ollama, you can try increasing the `num_ctx` parameter. Refer to the Ollama Modelfile documentation to create a new model with a larger `num_ctx`.
Additionally, see the related issue https://github.com/ollama/ollama/issues/2653 for more information.
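As a rough sketch of the workaround above: with Ollama you can define a Modelfile that raises `num_ctx` and build a new model from it. The base model name (`llama3`) and the context size (`8192`) below are placeholders; substitute the model you actually serve and a context length your hardware can handle.

```
# Modelfile — hypothetical example; replace "llama3" with your base model
FROM llama3
PARAMETER num_ctx 8192
```

Then build and use the new model, e.g. `ollama create llama3-8k -f Modelfile`, and point your GraphRAG config at `llama3-8k` instead of the original model.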
Consolidating alternate model issues here: https://github.com/microsoft/graphrag/issues/657
Describe the issue
In site-packages\graphrag\query\structured_search\global_search\search.py, line 200, in _map_response_single_batch, I printed search_response and it looks like this:
Please provide me with the story so I can identify the top themes.
How can I fix this?
Steps to reproduce
No response
GraphRAG Config Used
No response
Logs and screenshots
No response
Additional Information