Hi,
https://github.com/PromptEngineer48/Ollama/blob/main/2-ollama-privateGPT-chat-with-docs/privateGPT.py reads a couple of environment variables, like `MODEL`, and nothing sets those variables. So when I run `ollama pull codellama` (note the intentional change from `mistral` to `codellama`) and then continue following your steps, your script goes bananas. Nothing tells your program that I'm now using the `codellama` model, so it still wants to use the `mistral` model, which I don't have, so it fails.
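
For anyone hitting the same thing, here's a minimal sketch of what I assume is going on, based on how scripts like this usually read their config (the actual default in privateGPT.py may differ):

```python
import os

# Hypothetical sketch: the script reads the MODEL environment variable
# and falls back to "mistral" when the variable is unset -- which would
# explain why it keeps asking for mistral even after `ollama pull codellama`.
model = os.environ.get("MODEL", "mistral")
print(f"Using model: {model}")
```

If that's the case, exporting the variable before launching should steer it to the right model, e.g. `MODEL=codellama python privateGPT.py`.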