SonicWarrior1 / pdfchat

Local PDF Chat Application with Mistral 7B LLM, Langchain, Ollama, and Streamlit
MIT License

Naive Question: #1

Closed: csv610 closed this issue 10 months ago

csv610 commented 11 months ago

Hello,

I have installed Ollama on my machine, but I do not know how to solve this problem. Can you help?

ValueError: Ollama call failed with status code 404. Details: model 'mistral:instruct' not found, try pulling it first

SonicWarrior1 commented 11 months ago

> Hello,
>
> I have installed Ollama on my machine, but I do not know how to solve this problem. Can you help?
>
> ValueError: Ollama call failed with status code 404. Details: model 'mistral:instruct' not found, try pulling it first

The error occurs because you haven't pulled/downloaded the Mistral 7B model to your local machine. If you have installed Ollama in Docker, as I did, make sure the Docker container is running and then run this command in cmd:

docker exec -it ollama ollama run mistral:instruct

The first time you run it, this command downloads the Mistral 7B model and then starts it. After that, you can close the cmd window and try running the code again.
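
If Ollama is installed natively on your machine rather than in Docker, the equivalent fix is to pull the model with the Ollama CLI directly (a minimal sketch, assuming the ollama binary is on your PATH and the Ollama server is running):

ollama pull mistral:instruct
ollama run mistral:instruct

Once the model has been pulled, a short Python script can confirm that LangChain can reach it. This is only a sketch, not the app's own code; it assumes the langchain-community package is installed and that Ollama is listening on its default port (11434):

# Quick check that the pulled model is reachable through LangChain.
from langchain_community.llms import Ollama

# Assumes the local Ollama server is running at the default http://localhost:11434.
llm = Ollama(model="mistral:instruct")

# If this prints a response, the 404 "model not found" error is resolved.
print(llm.invoke("Say hello in one sentence."))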