severian42 / GraphRAG-Local-UI

GraphRAG using Local LLMs - Features robust API and multiple apps for Indexing/Prompt Tuning/Query/Chat/Visualizing/Etc. This is meant to be the ultimate GraphRAG/KG local LLM app.
MIT License

Error: [Errno 111] Connection refused #30

Closed Essence9999 closed 3 months ago

Essence9999 commented 3 months ago

I am using a base URL for the LLM that is compatible with the OpenAI interface (screenshot attached).

severian42 commented 3 months ago

Hey! This version on the public repo is only usable with Ollama at the moment. I use the ollama modules directly, which overrides the OpenAI compatibility. This was done mainly as a quick hack for the Ollama-specific embeddings. The new update (coming soon) will be fully OpenAI compatible.
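To make the distinction concrete, here is a hedged sketch of the two request shapes involved. The function names are illustrative, not from this repo; the endpoint paths and payload keys follow Ollama's documented native embeddings API (`/api/embeddings` with a `prompt` field) versus the OpenAI-style embeddings API (`/v1/embeddings` with an `input` field). A base URL that only speaks one shape will reject or refuse requests sent in the other:

```python
def ollama_embed_request(base_url: str, model: str, text: str) -> tuple[str, dict]:
    """Build the URL and payload for Ollama's native embeddings endpoint,
    which this repo currently calls directly via the ollama modules."""
    return (
        f"{base_url.rstrip('/')}/api/embeddings",
        {"model": model, "prompt": text},  # Ollama native uses "prompt"
    )

def openai_embed_request(base_url: str, model: str, text: str) -> tuple[str, dict]:
    """Build the URL and payload in the OpenAI-compatible shape that the
    announced update would target instead."""
    return (
        f"{base_url.rstrip('/')}/v1/embeddings",
        {"model": model, "input": [text]},  # OpenAI style uses "input"
    )

# Example: the same text produces two different endpoints and payloads.
url, payload = ollama_embed_request("http://localhost:11434", "nomic-embed-text", "hello")
```

So pointing the current UI at a generic OpenAI-compatible base URL fails even before networking is considered, because the requests go to Ollama's native paths.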

gaussmao commented 3 months ago

Such a great tool, thanks for sharing it! But I seem to be running into an issue. I've used docker-compose to run: 1) an Ollama instance serving a 7B model (curl http://localhost:11434/ confirms "Ollama is running"), and 2) the GraphRAG Ollama UI, in a separate container. The web UI appears to work normally, including uploading an external knowledge base and configuring the LLM. However, whenever I send a query, whether local or global, I get Error: [Errno 111] Connection refused. Do you have any suggestions on how to resolve this?
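A likely cause worth checking: [Errno 111] Connection refused means nothing was listening at the address the client resolved, and inside a Docker container "localhost" refers to the container itself, not the host (or sibling container) where Ollama is running. A minimal probe, assuming the standard Ollama port 11434 (can_connect is a hypothetical helper, and the candidate hosts are assumptions: host.docker.internal works on Docker Desktop, while 172.17.0.1 is a common default bridge IP on Linux):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    A refused or timed-out connection (the [Errno 111] case) means the
    base URL configured in the UI does not reach Ollama from where the
    query code actually runs.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS failures
        return False

# Probe candidate hosts from inside the GraphRAG container; whichever
# returns True is the host to use in the LLM base URL.
for host in ("localhost", "host.docker.internal", "172.17.0.1"):
    print(host, can_connect(host, 11434))
```

If none succeed, putting both containers on a shared docker-compose network and using the Ollama service name as the host is the usual fix.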

Essence9999 commented 3 months ago

Opening port 11434 with the command `sudo ufw allow 11434` solved my problem.