ivanfioravanti / chatbot-ollama

Chatbot Ollama is an open source chat UI for Ollama.

When I run the app, the chatbot doesn't reply and says: 'model 'mistral:latest' not found, try pulling it first' #22

Closed. Max-J-B123 closed this issue 7 months ago.

Max-J-B123 commented 8 months ago

I'm sure it's something silly I'm missing.

I tried running docker pull mistral:latest, but it said I didn't have permission.

jmcarlile commented 8 months ago

I haven't tried anything yet, but just wanted to throw out my first idea. Have you tried using sudo and your system password?

ivanfioravanti commented 8 months ago

@Max-J-B123 try ollama run mistral:latest
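For reference, if you only want to download the model without opening an interactive chat session, ollama pull performs just the download step:

ollama pull mistral:latest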

CorentinWicht commented 8 months ago

Same issue here, even though I specifically selected the llama-2 model (screenshot attached).

Here is the error I get: [OllamaError: model 'mistral:latest' not found, try pulling it first] { name: 'OllamaError' }
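A quick way to confirm whether the model is actually missing is to list what Ollama has installed locally; if mistral:latest doesn't appear in the output, this error is expected until it has been pulled:

ollama list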

herropaul commented 8 months ago

@Max-J-B123 try ollama run mistral:latest

confirming that running this first works:

(Screenshots attached.)
CorentinWicht commented 8 months ago


Many thanks for the suggestion, it now works. It seems that, to ensure the UI works smoothly, you have to download the default model (mistral:latest) through Ollama, even though you chose another model (e.g., llama2:latest) in the prompt. That's a bit of a weird behaviour.

herropaul commented 8 months ago


Glad to hear it works on your end! But yeah, I feel like the UI should check whether the user has actually installed the selected model through Ollama before starting the chat.
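As a rough sketch of that idea (not something the app does today), such a check could query Ollama's /api/tags endpoint, which returns the models installed locally, and only then allow the chat or prompt the user to pull the missing one. Assuming Ollama is running on its default port 11434:

# List the locally installed models as JSON
curl -s http://localhost:11434/api/tags

# Pull any model the check reports as missing, e.g.:
ollama pull llama2:latest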

ZiTAL commented 7 months ago

You don't need to install mistral; you only need to install the model you want, but creating a .env.local file is required.

For example, I want to use codellama:

ollama run codellama:latest

I created the file .env.local with this content:

# Chatbot Ollama
DEFAULT_MODEL="codellama:latest"
NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT=""

And in the web front-end I can use it by default.

The file is copied from .env.local.example.
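Putting the steps together (assuming Ollama is installed and the repository is already cloned), the sequence is roughly:

# Pull the model you actually want to use
ollama pull codellama:latest

# Create the local env file from the provided example, then set DEFAULT_MODEL in it
cp .env.local.example .env.local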

Max-J-B123 commented 7 months ago

"ollama run mistral:latest" this command worked for me