greeschenko / vim9-ollama

Locally driven AI assistant plugin written in the cutting-edge Vim9 script and powered by ollama
MIT License

already in use? Key not in response? #2

Closed KUKARAF closed 1 month ago

KUKARAF commented 5 months ago

Steps to reproduce

  1. Install ollama in WSL; the ollama version is 0.1.29.
  1a. Get the vim version: VIM - Vi IMproved 9.1 (2024 Jan 02, compiled Jan 01 1980 00:00:00), Included patches: 1-75, Compiled by nixbld.
  2. download required models
  3. start vim
  4. What do you mean "already in use"? Why are you trying to start a server if you are meant to communicate with an existing server? (See the sketch after this list.)
  4a. (screenshot of the error in the original issue)
  5. OK, whatever: sudo systemctl start ollama.service
  6. Start vim again. (screenshot of the second error in the original issue)
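
A quick way to diagnose the conflict in steps 4-6 is to check whether an ollama server is already listening before starting another one. A minimal sketch, assuming ollama's default port 11434 (used unless OLLAMA_HOST overrides it):

# Probe the default ollama port; prints "Ollama is running" if a server is up
curl -s http://127.0.0.1:11434/ || echo "no server on 11434"

# Under systemd, check whether the packaged service already owns the port
systemctl status ollama.service

If the systemd unit is active, a manual ollama serve will fail with "address already in use", which is what the screenshot in step 4a shows.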

Question

What is the state of the ollama service and server meant to be?

greeschenko commented 5 months ago

Hello, the error from 4a means that ollama serve is already started, and that is fine; in future updates I will add a check for this. First run sudo systemctl stop ollama.service, then try to start ollama serve in a terminal and show its output. If the ollama server reports no error, please try ollama run codellama:latest "write simple web api in javascript"
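
The suggested sequence from the comment above, as a shell sketch (all commands taken from the thread):

# Stop the systemd-managed instance so it releases the port
sudo systemctl stop ollama.service

# Start the server by hand so its log output is visible in the terminal
ollama serve

# In a second terminal, confirm the model actually answers
ollama run codellama:latest "write simple web api in javascript"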

alexandremcosta commented 5 months ago

What about the error on 6?

Nevermind, I found the reason: I was using llama3 instead of codellama. The plugin only works with the codellama model.

greeschenko commented 4 months ago

Hello, yes, for now the plugin works only with codellama. You need to pull 2 models:

ollama pull codellama:latest
ollama pull codellama:7b-code

In the future I will try to support other models, but for now only codellama works as I expect.
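
To confirm both models were pulled, ollama's listing command can be used (a usage sketch, not from the thread):

# Both codellama:latest and codellama:7b-code should appear in the output
ollama list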