Closed: KUKARAF closed this issue 1 month ago
Hello,
The error from step 4a means that `ollama serve` had already been started, so everything is fine; in a future update I will add a check for this.
First, run `sudo systemctl stop ollama.service`,
then try to start `ollama serve`
in a terminal and post its output here.
If `ollama serve` starts without errors, please try: ollama run codellama:latest "write simple web api in javascript"
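The restart-and-test sequence above can be sketched as one script (assuming the systemd unit is named ollama.service and that ollama is on your PATH; the script is a no-op if ollama is not installed):

```shell
#!/bin/sh
# Model name used in the thread; adjust if you test a different one.
MODEL="codellama:latest"

# Only attempt the restart when ollama is actually installed.
if command -v ollama >/dev/null 2>&1; then
    # Stop the systemd-managed instance so a manual run does not
    # conflict with it (a port/bind error is the usual symptom).
    sudo systemctl stop ollama.service

    # Start the server as a background job so its log output stays
    # visible in this terminal.
    ollama serve &
    SERVER_PID=$!
    sleep 2

    # If the server came up cleanly, exercise the model once.
    ollama run "$MODEL" "write simple web api in javascript"

    # Stop the manually started server again.
    kill "$SERVER_PID" 2>/dev/null
fi
```

Running `ollama serve` in the foreground like this is mainly useful for seeing the server's error output directly instead of digging through the journal.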
What about the error in step 6?
Never mind, I found the reason: I was using llama3 instead of codellama. The plugin only works with the codellama model.
Hello, yes, for now the plugin works only with codellama.
You need to pull two models:
ollama pull codellama:latest
ollama pull codellama:7b-code
In the future I will try to support other models, but for now only codellama works as I expect.
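The two pulls above can be wrapped in a small script that also lists the local models afterwards so you can confirm both are present (ollama list shows pulled models; the script is a no-op if ollama is not installed):

```shell
#!/bin/sh
# The two models the plugin expects, per the reply above.
MODELS="codellama:latest codellama:7b-code"

if command -v ollama >/dev/null 2>&1; then
    # Pull each model; re-pulling an existing model is harmless.
    for m in $MODELS; do
        ollama pull "$m"
    done
    # List local models to confirm both pulls succeeded.
    ollama list
fi
```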
Steps to reproduce
ollama version is 0.1.29
1a. Get vim version: VIM - Vi IMproved 9.1 (2024 Jan 02, compiled Jan 01 1980 00:00:00), Included patches: 1-75, Compiled by nixbld
sudo systemctl start ollama.service
Question
What is the state of the ollama service and server meant to be?
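For reference, there are two things to inspect here: the systemd unit's state (via systemctl) and whether a server is actually answering. Ollama listens on localhost:11434 by default and answers a plain GET on / with "Ollama is running"; only one instance can own that port, so either the service or a manual `ollama serve` should be running, not both. A quick check, assuming the default port:

```shell
#!/bin/sh
# Default Ollama endpoint; adjust if you changed OLLAMA_HOST.
OLLAMA_URL="http://localhost:11434"

# Is the systemd unit running? Prints "active" or "inactive";
# the || true keeps the script going either way.
systemctl is-active ollama.service 2>/dev/null || true

# Is a server (systemd-managed or manual) answering on the port?
curl -s --max-time 2 "$OLLAMA_URL" || echo "no server reachable at $OLLAMA_URL"
```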