guoxiangke opened 3 weeks ago
Hi, I finally got this running on my local machine, but the chat returns nothing, and there is no error.
steps:
make this repo work locally.
reinstall the dependencies:
pip uninstall aiofiles graphrag chainlit -y
pip install aiofiles==23.1.0
pip install chainlit==1.1.306
pip install --no-deps graphrag
copy my.txt into the input folder and run "python -m graphrag.index --root ." with gpt-4o-mini (indexing with ollama + llama3.1 did not work locally)
then start the proxy: litellm --model ollama/llama3.1:8b --api_base http://localhost:11434
querying from the CLI also works well: python -m graphrag.query --root . --method global "my question?"
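For anyone reproducing this: the query step has to be pointed at the litellm proxy rather than at OpenAI. A minimal settings.yaml sketch, assuming litellm is serving its OpenAI-compatible endpoint on its default port 4000; the field names follow graphrag's generated settings.yaml, but verify them against your installed version:

```yaml
llm:
  type: openai_chat
  api_key: dummy                     # the litellm proxy accepts any key by default
  model: ollama/llama3.1:8b          # must match the model name litellm was started with
  api_base: http://localhost:4000    # litellm proxy address; adjust host/port if changed
```

If the chat still returns nothing with no error, it is worth checking the litellm proxy logs to confirm requests are actually arriving there.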
Have you solved the problem now?