s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0

first install, first try : error raised #54

Closed rileyrg closed 5 months ago

rileyrg commented 5 months ago

I installed ollama and had it pull zephyr.

When I "C-c e a i" to ask Ella something:- I type in a question, hit enter and ....

  Debugger entered--Lisp error: (wrong-number-of-arguments (2 . 2) 1)
    #f(compiled-function (_ msg) #<bytecode -0x39ad22ed7423a08>)("Unknown error calling ollama")
    llm-request-callback-in-buffer(#<buffer ellama when is christmas? (mistral:7b-instruct-v0.2-q6/K).org> #f(compiled-function (_ msg) #<bytecode -0x39ad22ed7423a08>) "Unknown error calling ollama")
    #f(compiled-function () #<bytecode 0x1a7d3104803e9095>)(404 ((error . "model 'mistral:7b-instruct-v0.2-q6/K' not found, t...")))
    #f(compiled-function (on-success on-error) #<bytecode -0x46530f480170ce3>)((:error (error http 404)) nil #f(compiled-function (_) #<bytecode 0x1a7d3104803e9095>))
    url-http-activate-callback()
    url-http-content-length-after-change-function(125 207 82)
    url-http-wait-for-headers-change-function(1 212 211)
    url-http-generic-filter(# "HTTP/1.1 404 Not Found\15\nContent-Type: application/...")

I haven't tried to debug anything yet, just in case this is obvious to someone here. Using consult, btw.

s-kostyaev commented 5 months ago

Remove ellama-provider from your configuration or pull "mistral:7b-instruct-v0.2-q6/K"
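
For reference, a sketch of that second option based on the pull command given later in this thread (note the underscore in the tag; the /K in the config appears to be a typo):

  # pull the model named in the config (note the underscore: q6_K, not q6/K)
  ollama pull mistral:7b-instruct-v0.2-q6_K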

rileyrg commented 5 months ago

Ok. Just FYI, I used the exact use-package install block from the GitHub README.

  (use-package ellama
    :init
    (setopt ellama-language "German")
    (require 'llm-ollama)
    (setopt ellama-provider
            (make-llm-ollama
             :chat-model "mistral:7b-instruct-v0.2-q6/K"
             :embedding-model "mistral:7b-instruct-v0.2-q6/K"))
    ;; Predefined llm providers for interactive switching.
    ;; You shouldn't add ollama providers here - it can be selected interactively
    ;; without it. It is just example.
    (setopt ellama-providers
            '(("zephyr" . (make-llm-ollama
                           :chat-model "zephyr:7b-beta-q6_K"
                           :embedding-model "zephyr:7b-beta-q6_K"))
              ("mistral" . (make-llm-ollama
                            :chat-model "mistral:7b-instruct-v0.2-q6_K"
                            :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
              ("mixtral" . (make-llm-ollama
                            :chat-model "mixtral:8x7b-instruct-v0.1-q3/K/M-4k"
                            :embedding-model "mixtral:8x7b-instruct-v0.1-q3/K/M-4k")))))


s-kostyaev commented 5 months ago

I think you should read it before copy-pasting 😃

s-kostyaev commented 5 months ago

I have added more comments about this to the README. You are not the first person to hit this exact problem.

rileyrg commented 5 months ago

Remove ellama-provider from your configuration or pull "mistral:7b-instruct-v0.2-q6/K"

I'm sorry, but I don't really understand what you mean. I did read the README, but I'm still confused. By "pull mistral" do you mean delete it from ellama-providers? Not pull it from GitHub?

s-kostyaev commented 5 months ago

I mean run this in your terminal:

  ollama pull mistral:7b-instruct-v0.2-q6_K   # there was also a typo here: q6/K should be q6_K

s-kostyaev commented 5 months ago

Or remove all ellama configuration from your config file and restart Emacs - everything should work fine by default if you have ollama installed and zephyr pulled.

rileyrg commented 5 months ago

FYI, I just deleted all the lines mentioning a provider and it worked.

  (use-package ellama
    :init
    (setopt ellama-language "German")
    (require 'llm-ollama))

s-kostyaev commented 5 months ago

FYI, I just deleted all the lines mentioning a provider and it worked.

  (use-package ellama
    :init
    (setopt ellama-language "German")
    (require 'llm-ollama))

Even those lines can be removed 🙂
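
For completeness, a minimal sketch of what would remain, assuming ollama is installed locally and zephyr has already been pulled (in that case ellama falls back to its default provider, so no :init is needed):

  ;; minimal setup: relies on ellama's default provider (ollama + zephyr)
  (use-package ellama)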

rileyrg commented 5 months ago

FYI I did the ollama pull you suggested

ollama pull mistral:7b-instruct-v0.2-q6_K

and it still "didn't work" - I checked the "mistral:7b-instruct-v0.2-q6_K" in the config and it all seemed to look OK.

I don't know enough about this and don't want to waste your time, but I'll come back to it and provide an update if I can.

Thank you for your help.


s-kostyaev commented 5 months ago

For it to work without any configuration, you should run:

ollama pull zephyr

in your terminal. If you use:

ollama pull mistral:7b-instruct-v0.2-q6_K

You need this ellama configuration:

  (use-package ellama
    :init
    (require 'llm-ollama)
    (setopt ellama-provider
                    (make-llm-ollama
                     ;; this model should be pulled to use it
                     ;; value should be the same as you print in terminal during pull
                     :chat-model "mistral:7b-instruct-v0.2-q6_K"
                     :embedding-model "mistral:7b-instruct-v0.2-q6_K")))

Check the updated README.
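
Since the pulled model tag has to match the config exactly, one way to double-check (a suggestion, not from the original thread) is to list the locally available models and compare the name character for character:

  # list the models ollama has pulled locally; the NAME column must match
  # the :chat-model / :embedding-model strings exactly (q6_K, not q6/K)
  ollama list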

s-kostyaev commented 5 months ago

Close this issue if your setup works, or feel free to come back here for advice.

s-kostyaev commented 5 months ago

Closing this due to inactivity. @rileyrg feel free to reopen it.