Closed vinhqdang closed 4 months ago
Hey! If you edit the model name to reflect the one you pulled, i.e. mistral, it should work for you just fine. The mistral:7b is just how it's uniquely named on my local system.
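For anyone following along, the fix would be a one-line change in setting.yaml. A minimal sketch (the exact key name and file layout are assumptions, not confirmed by this thread; check your own setting.yaml for the actual structure):

```yaml
# setting.yaml — hypothetical fragment
# Change the model value from the thread author's local tag:
#   model: mistral:7b
# to whatever `ollama list` shows on your machine, e.g.:
model: mistral
```

The key point is that the value must match a model name Ollama actually has installed locally.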
I have a similar problem. I modified the model setting, but the system still asks for mistral:7b.
So I think I found the issue with the current version. Right under the chat window there is a Model Parameters dropdown menu, and from there you can select the model. This menu selection overrides the setting.yaml input. If you use the dropdown to select the model, it should update your setting.yaml; if not, just make sure the two match and it'll work for you.
Hello
Thanks for the code, it looks great.
I have just pulled your code and tried to run it.
I already ran "ollama run mistral" on my local Mac and it seems to work.
But when I try to chat in the Gradio UI I get this error.
Do you have any suggestions for how I can fix it?