Closed: Essence9999 closed this issue 3 months ago
Hey! So the version on the public repo only works with Ollama at the moment. I use the Ollama modules directly, which overrides the OpenAI compatibility. This was done mainly as a quick hack for the Ollama-specific embeddings. The next update (coming soon) will be fully OpenAI-compatible.
Such a great tool, thanks for sharing it! But I seem to be running into an issue. I've used docker-compose to run: 1) an Ollama instance with a 7B model deployed (curl http://localhost:11434/ confirms "Ollama is running"), and 2) the GraphRAG Ollama UI (the two run in separate containers). The web UI appears to function normally, including uploading an external knowledge base and configuring the LLM. However, whenever I send a query, whether local or global, I get: Error: [Errno 111] Connection refused. Do you have any suggestions on how to resolve this?
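In case it helps others hitting the same error: inside a container, localhost refers to the container itself, so a UI container cannot reach an Ollama instance in a sibling container via http://localhost:11434. A minimal docker-compose sketch of one way to wire the two together (the service names, image names, and environment variable here are illustrative assumptions, not from this repo):

```yaml
# Hypothetical sketch: both services on the same default compose network,
# so the UI reaches Ollama by its service name rather than localhost.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
  graphrag-ui:
    image: your-graphrag-ui-image   # placeholder image name
    environment:
      # The exact variable name depends on the app's configuration;
      # the key point is the host must be "ollama", not "localhost".
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
```

Alternatively, if Ollama runs on the host rather than in a container, the UI container can often reach it via http://host.docker.internal:11434 (Docker Desktop) or the host's LAN IP.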
Opening port 11434 with sudo ufw allow 11434 solved the problem for me.
Use a base URL for the LLM that is compatible with the OpenAI interface.
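For reference, recent Ollama versions also expose an OpenAI-compatible API under the /v1 path (you can sanity-check it with curl http://localhost:11434/v1/models). A hedged configuration sketch; the variable names depend on the client application and are assumptions here:

```
# Point an OpenAI-style client at Ollama's OpenAI-compatible endpoint.
# (Variable names are illustrative; check your app's configuration docs.)
OPENAI_API_BASE=http://localhost:11434/v1
OPENAI_API_KEY=ollama   # Ollama ignores the key, but some clients require a non-empty value
```

If the client runs in a container, replace localhost with the Ollama container's service name or host address, as discussed above.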