SerhiyProtsenko closed this issue 1 year ago
Have you tried running
ollama pull mistral
and then
ollama run mistral
?
Yes, I did:
ollama pull mistral
ollama run mistral
(base) serhiy@serhiy-protsenko:~/Downloads/ollama$ ollama pull mistral
pulling manifest
pulling 6ae280299950... 100% |█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| (4.1/4.1 GB, 60 TB/s)
pulling fede2d8d6c1f... 100% |███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| (29/29 B, 1.1 MB/s)
pulling b96850d2e482... 100% |██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| (307/307 B, 11 MB/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
(base) serhiy@serhiy-protsenko:~/Downloads/ollama$ ollama run mistral
>>> Send a message (/? for help)
And then, in another terminal:
OLLAMA_ORIGINS=https://webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve
2023/10/15 17:06:59 images.go:995: total blobs: 0
2023/10/15 17:06:59 images.go:1002: total unused blobs removed: 0
2023/10/15 17:06:59 routes.go:614: Listening on 127.0.0.1:11435
[GIN] 2023/10/15 - 17:07:57 | 204 | 56.44µs | 127.0.0.1 | OPTIONS "/api/generate"
[GIN] 2023/10/15 - 17:07:57 | 404 | 193.095µs | 127.0.0.1 | POST "/api/generate"
[GIN] 2023/10/15 - 17:08:08 | 404 | 163.055µs | 127.0.0.1 | POST "/api/generate"
When I load the .pdf, everything is fine. But when I ask a question, I get the error shown in the screenshot above.
I have also checked:
(base) serhiy@serhiy-protsenko:~/Downloads/ollama$ ollama list
NAME            ID            SIZE    MODIFIED
mistral:latest  8aa307f73b26  4.1 GB  11 minutes ago
(base) serhiy@serhiy-protsenko:~/Downloads/ollama$
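The 404s on /api/generate, together with "total blobs: 0" in the startup log, suggest the server on port 11435 has no models registered even though the default instance does. One way to confirm this (a sketch against the standard Ollama HTTP API, using the host and port from the log above) is to query that instance directly:

```shell
# Ask the custom-port server which models it has registered;
# an empty "models" list here would explain the 404 from /api/generate.
curl -s http://127.0.0.1:11435/api/tags || echo "no server listening on 11435"

# Send a generate request to the same instance, naming the model
# the web app expects; a 404 means this instance doesn't have it.
curl -s http://127.0.0.1:11435/api/generate \
  -d '{"model": "mistral", "prompt": "Hello"}' || echo "no server listening on 11435"
```

If /api/tags on 11435 comes back empty while `ollama list` shows mistral, the model was pulled into a different server instance than the one the web app is talking to.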
@SerhiyProtsenko are you on Linux? If so, try running this first to make sure the model gets pulled correctly:
OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
If not, let me know and I'll help you get up and running 👍
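For context (my reading of the behavior, not something stated explicitly in this thread): the ollama CLI acts as a client to whichever server OLLAMA_HOST points at, so a plain pull registers the model with the default instance on port 11434 rather than the one serving on 11435. A sketch of the difference, guarded so it is a no-op on machines without the ollama CLI installed:

```shell
if command -v ollama >/dev/null; then
  # Without OLLAMA_HOST the client targets the default server on
  # 127.0.0.1:11434, so that instance is the one that receives the model.
  ollama pull mistral

  # Prefixing the variable points the client at the server started with
  # OLLAMA_HOST=127.0.0.1:11435, so the model registers there instead.
  OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
fi
```

The env-var prefix applies only to that single command, so other terminals keep talking to the default server.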
Yes, Linux.
This
OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
helped! Thank you!
Now everything works as it should!
I'm having trouble getting it to start. I installed everything as described in the repository, and I can talk to the model in the CLI, but when I use the web service I get errors; see screenshot.