severian42 / GraphRAG-Local-UI

GraphRAG using Local LLMs - Features robust API and multiple apps for Indexing/Prompt Tuning/Query/Chat/Visualizing/Etc. This is meant to be the ultimate GraphRAG/KG local LLM app.

Error: model "mistral:7b" not found, try pulling it first #18

Closed vinhqdang closed 4 months ago

vinhqdang commented 4 months ago

Hello

Thanks for the code, it seems great.

I have just pulled your code and tried to run it.

I already called "ollama run mistral" on my local Mac and it seems to work.


But when I tried to chat in the Gradio UI I got this error:

Error: model "mistral:7b" not found, try pulling it first

Do you have any suggestions for how I can fix this?
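For reference, the same failure shows up when querying the local Ollama server directly, outside the UI. A minimal check (assumes Ollama's default REST API on port 11434 and the requests package):

```python
import requests

# Show which model tags the local Ollama server actually has available.
# Note: "ollama run mistral" registers the tag "mistral:latest", not "mistral:7b".
tags = requests.get("http://localhost:11434/api/tags").json()
print([m["name"] for m in tags.get("models", [])])  # e.g. ['mistral:latest']

# Asking for "mistral:7b" directly reproduces the error from the title.
r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral:7b", "prompt": "hi", "stream": False},
)
print(r.status_code, r.json())
```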

severian42 commented 4 months ago

Hey! If you edit the model name to reflect the one you pulled, i.e. mistral, it should work just fine. mistral:7b is just how it's uniquely named on my local system.
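For example, assuming the stock GraphRAG settings.yaml layout (the exact keys in this project may differ), the relevant field would look like:

```yaml
llm:
  type: openai_chat                    # Ollama is reached via its OpenAI-compatible endpoint
  api_base: http://localhost:11434/v1  # default local Ollama address
  model: mistral                       # must match a tag listed by `ollama list`
```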

canytam-krystal commented 4 months ago

I have a similar problem. I modified the model setting, but the system still asks for mistral:7b.

severian42 commented 4 months ago

So I think I found the issue with the current version. Right under the chat window, I have the Model Parameters dropdown menu, and from there you can select the model. That menu selection overrides the settings.yaml input. If you use the dropdown and select the model, it should update your settings.yaml; if not, just make sure the two match and it'll work for you.
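In other words, the dropdown value wins. A hypothetical sketch of that precedence (the function and key names here are illustrative, not the app's actual code):

```python
import yaml  # PyYAML

def resolve_model(dropdown_choice, settings_path="settings.yaml"):
    """Return the model name the app will actually send to Ollama.

    The Model Parameters dropdown, when set, takes precedence over
    settings.yaml, which is why editing the file alone can appear
    to have no effect.
    """
    with open(settings_path) as f:
        settings = yaml.safe_load(f) or {}
    file_model = settings.get("llm", {}).get("model")
    # The UI selection, when present, overrides the file-based setting.
    return dropdown_choice or file_model
```

So resolve_model("mistral:7b") would return "mistral:7b" even if settings.yaml says mistral; keeping the two in sync avoids the mismatch.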