Closed: Max-J-B123 closed this issue 7 months ago
I haven't tried anything yet, but just wanted to throw out my first idea. Have you tried using sudo and your system password?
@Max-J-B123 try ollama run mistral:latest
Same issue here, even though I specifically selected the llama-2 model. Here is the error I get:
[OllamaError: model 'mistral:latest' not found, try pulling it first] { name: 'OllamaError' }
@Max-J-B123 try
ollama run mistral:latest
Confirming that running this first works.
Many thanks for the suggestion, it now works. It seems that, for the UI to work smoothly, you have to download the default model (mistral:latest) through Ollama even if you chose another model (e.g., llama2:latest) in the prompt. That's a bit of a weird behaviour.
Glad to hear it works on your end! But yeah, I feel like the UI should prompt and check whether the user has installed the models through Ollama before using the chat.
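That pre-flight check could be sketched in shell. This is a minimal sketch, assuming the ollama CLI is on the PATH; the `has_model` helper name is hypothetical, and it reads the model list from stdin so the check itself can be exercised without a running daemon:

```shell
# Hypothetical helper: succeeds if the given model name appears in the
# output of `ollama list` (read from stdin; one model per line after a header).
has_model() {
  grep -q "^$1" -
}

# Usage sketch (assumes the ollama CLI is installed):
#   ollama list | has_model "mistral:latest" || ollama pull mistral:latest
```

A wrapper like this could run before launching the UI and pull the missing model instead of letting the chat fail with "model not found".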
You don't need to install mistral; you only need to install the model you want, but creating a .env.local file is required.
For example I want to use codellama:
ollama run codellama:latest
I created the file .env.local with this content:
# Chatbot Ollama
DEFAULT_MODEL="codellama:latest"
NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT=""
And in the web front-end I can use it by default.
The file is copied from .env.local.example
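The steps above can be scripted. A minimal sketch, assuming it is run from the chatbot-ollama repo root (codellama:latest is just the example model from above; the pull step is shown as a comment because it needs the ollama CLI and a network connection):

```shell
# Step 1 (commented out; needs the ollama CLI): download the model you want.
#   ollama pull codellama:latest

# Step 2: write .env.local with the desired default model,
# mirroring the keys in .env.local.example.
cat > .env.local <<'EOF'
# Chatbot Ollama
DEFAULT_MODEL="codellama:latest"
NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT=""
EOF
```

After this, restarting the web front-end should pick up codellama:latest as the default instead of mistral:latest.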
"ollama run mistral:latest": this command worked for me.
I'm sure it's something silly I'm missing.
I tried typing 'docker pull mistral:latest' but it said I didn't have permission.