severian42 / GraphRAG-Local-UI

GraphRAG using Local LLMs - Features robust API and multiple apps for Indexing/Prompt Tuning/Query/Chat/Visualizing/Etc. This is meant to be the ultimate GraphRAG/KG local LLM app.

No LLM models when ollama is selected #33

Closed · psmukhopadhyay closed this issue 4 months ago

psmukhopadhyay commented 4 months ago

Trying this version (1.0) on Ubuntu 22.04.

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
datasets 2.19.1 requires fsspec[http]<=2024.3.1,>=2023.1.0, but you have fsspec 2024.6.1 which is incompatible.
opentelemetry-api 1.24.0 requires importlib-metadata<=7.0,>=6.0, but you have importlib-metadata 8.0.0 which is incompatible.
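For reference, the constraints in that message imply pins roughly like the following; these are inferred from the resolver output, not the repo's official requirements, and exact versions may differ in other environments:

```
# Pins inferred from the resolver error above (not the project's official pins).
pip install "fsspec>=2023.1.0,<=2024.3.1"      # satisfies datasets 2.19.1
pip install "importlib-metadata>=6.0,<=7.0"    # satisfies opentelemetry-api 1.24.0
```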

Successfully fetched Ollama models: ['llama2:latest', 'mistral:7b-instruct-v0.2-q4_K_M', 'mistral:latest', 'nomic-embed-text:latest']

Changing the versions of these two packages (datasets and fsspec) allowed me to run the command gradio app.py. But after initializing the output folder 20240713-154301, when I select Ollama in the LLM settings and the Embedding settings, no models are shown. However, if I refresh and don't select either option, the LLM and embedding models do appear. What am I missing?

[screenshot: LLM and Embedding settings showing no models with Ollama selected]

severian42 commented 4 months ago

Hey! When you change the provider it doesn't automatically switch the base URL (I was debating whether to do that). You still need to enter the Ollama host your models are served from, typically localhost:11434/v1. The provider selection buttons affect the internal routing rather than setting the base URL. Sorry for the confusion; I'll make this more straightforward. Let me know if that solves it for you!
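For anyone else hitting this, here is a minimal standalone sketch (not part of the app; the host, port, and endpoint paths are Ollama's defaults and may differ on your setup) that checks the Ollama host is reachable, lists the models it serves, and prints the value to paste into the base URL field:

```python
# Standalone sanity check, separate from the app: confirm the Ollama host is
# reachable and lists the models you expect before entering it in the UI.
import requests

ollama_host = "http://localhost:11434"  # Ollama's default host/port; adjust if yours differs

# Ollama's native endpoint listing the models you have pulled.
tags = requests.get(f"{ollama_host}/api/tags", timeout=5)
tags.raise_for_status()
print([m["name"] for m in tags.json().get("models", [])])

# The value to enter in the app's base URL field is the OpenAI-compatible path.
print("Base URL to enter in the settings:", f"{ollama_host}/v1")
```

If the model list prints correctly here but the app still shows nothing, double-check that the same URL (including the /v1 suffix) is in both the LLM and Embedding settings.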