s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0

trouble running ollama with gemma:2b #79

Closed: tvraman closed this issue 4 months ago

tvraman commented 4 months ago

dbg-log.org

s-kostyaev commented 4 months ago

@tvraman it is not a link

tvraman commented 4 months ago

Link? I thought I attached an org file showing the backtrace ...

s-kostyaev commented 4 months ago

I can't see the attached file, just a file name.

sigma-957 commented 4 months ago

FWIW, I couldn't actually get gemma to work at all with Ollama, so this is potentially something upstream of this package.

tvraman commented 4 months ago

Oops -- I must have screwed up.

Here is the file:


tvraman commented 4 months ago

Gemma is working with ollama and gptel.


s-kostyaev commented 4 months ago

@tvraman Still no file content. Try adding it through the GitHub web interface. I can't reproduce your issue; gemma:2b works fine for me. Also try updating ollama: the release announcement published a couple of minutes ago mentions gemma support on all platforms.

tvraman commented 4 months ago

Gemma requires ollama 0.1.26, and I had already updated to that before I tried gemma.

Will try sending the attachment via the GitHub web interface.

tvraman commented 4 months ago

With ellama-providers set to:

#+begin_src emacs-lisp
(("ollama model" ellama-get-ollama-local-model)
 ("default model" . ellama-provider)
 ("ollama model" ellama-get-ollama-local-model)
 ("default model" . ellama-provider)
 ("ollama model" ellama-get-ollama-local-model))
#+end_src

and calling ellama-provider-select to pick ollama and the installed gemma:2b,

I get the following backtrace.

#+begin_src
Debugger entered--Lisp error: (error "Format specifier doesn’t match argument type")
  (format "%s://%s:%d/api/%s" "http" nil nil "generate")
  (llm-ollama--url #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") "generate")
  (#f(compiled-function (provider prompt) #<bytecode 0x14cdd6b2542495a>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
  (apply #f(compiled-function (provider prompt) #<bytecode 0x14cdd6b2542495a>) (#s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil)))
  (#f(compiled-function (&rest args) #<bytecode -0x134e5003bf9c4f49>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
  (apply #f(compiled-function (&rest args) #<bytecode -0x134e5003bf9c4f49>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
  (llm-chat #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
  (ellama-get-name "why is the sky blue")
  (ellama-generate-name-by-llm #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") ellama "why is the sky blue")
  (ellama-generate-name #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") ellama "why is the sky blue")
  (ellama-new-session #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") "why is the sky blue")
  (#f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue" nil)
  (ad-Advice-ellama-chat #f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue")
  (apply ad-Advice-ellama-chat #f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue")
  (ellama-chat "why is the sky blue")
  (# ellama-chat "why is the sky blue")
  (apply # ellama-chat "why is the sky blue")
  (ad-Advice-funcall-interactively # ellama-chat "why is the sky blue")
  (apply ad-Advice-funcall-interactively # (ellama-chat "why is the sky blue"))
  (funcall-interactively ellama-chat "why is the sky blue")
  (# ellama-chat nil nil)
  (apply # ellama-chat (nil nil))
  (call-interactively@ido-cr+-record-current-command # ellama-chat nil nil)
  (apply call-interactively@ido-cr+-record-current-command # (ellama-chat nil nil))
  (call-interactively ellama-chat nil nil)
  (command-execute ellama-chat)
#+end_src
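The first frame shows the root cause: format receives nil for the provider's :host and :port, so the %s and %d specifiers cannot be filled in. A minimal workaround sketch, not from this thread: construct the provider with those fields set explicitly (the host and port values below are assumptions for a default local ollama install).

#+begin_src emacs-lisp
;; Workaround sketch (hypothetical values): give the provider an
;; explicit host and port so llm-ollama--url never formats nil.
;; 11434 is ollama's default port; adjust if your server differs.
(require 'ellama)
(require 'llm-ollama)
(setopt ellama-provider
        (make-llm-ollama :host "localhost"
                         :port 11434
                         :chat-model "gemma:2b"
                         :embedding-model "gemma:2b"))
#+end_src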

s-kostyaev commented 4 months ago

@tvraman Try updating your Emacs packages (ellama and llm should both be on their latest versions).
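A minimal sketch of that update, assuming the packages were installed with package.el (package-upgrade is available in Emacs 29 and later):

#+begin_src emacs-lisp
;; Refresh archive metadata, then upgrade both packages.
(package-refresh-contents)
(dolist (pkg '(ellama llm))
  (package-upgrade pkg))
#+end_src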

s-kostyaev commented 4 months ago

@tvraman Which Emacs version do you have?

s-kostyaev commented 4 months ago

@tvraman Thank you for the report. This should be fixed in 0.8.7. After updating, try calling ellama-provider-select one more time.
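For completeness, the post-update step as a sketch (ellama-provider-select is an interactive command, so M-x ellama-provider-select works equally well):

#+begin_src emacs-lisp
;; After updating to ellama 0.8.7, re-run provider selection so the
;; session picks up a provider with valid host and port fields.
(call-interactively #'ellama-provider-select)
#+end_src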

tvraman commented 4 months ago

Emacs 30, built weekly from Git head.

tvraman commented 4 months ago

Thanks! Will try later today.

tvraman commented 4 months ago

Works like a charm after updating ellama.