Open guoxiangke opened 2 months ago
Hi, finally I got this running locally, but the chat returns nothing and there is no error.
steps:
Make this repo work locally.
pip uninstall aiofiles graphrag chainlit -y
pip install aiofiles==23.1.0
pip install chainlit==1.1.306
pip install --no-deps graphrag
Copy my.txt to the input folder and run "python -m graphrag.index --root ." with gpt-4o-mini (ollama + llama3.1 did not work locally for indexing).
Then start the proxy: litellm --model ollama/llama3.1:8b --api_base http://localhost:11434
Querying also works well: python -m graphrag.query --root . --method global "my question?"
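To isolate whether the silence comes from the chat UI or from the model, a minimal sketch that talks to the litellm proxy directly through its OpenAI-compatible endpoint (assuming the proxy started above listens on its default port 4000 and the openai Python package is installed; the port and API key are assumptions, not from the original steps):

```python
# Sanity check: bypass chainlit/graphrag and hit the litellm proxy directly.
# Assumption: `litellm --model ollama/llama3.1:8b --api_base http://localhost:11434`
# is running and serving on its default port 4000.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # litellm proxy, not ollama itself
    api_key="sk-anything",             # litellm accepts any key unless keys are configured
)

resp = client.chat.completions.create(
    model="ollama/llama3.1:8b",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```

If this prints a reply, the model and proxy are fine and the empty chat likely comes from the chainlit/graphrag layer; if it hangs or prints nothing, the problem is upstream of the UI.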
Have you solved the problem now?
Same problem here, has it been solved?