@tvraman it is not a link
link? I thought I attached an org file showing the backtrace ... --
I can't see the attached file, just a file name.
FWIW, I couldn't actually get gemma to work at all with Ollama, so this is potentially something upstream of this package.
Oops -- I must have screwed up.
Here is the file:
--
gemma is working with ollama and gptel
--
@tvraman Still no file content. Try adding it via the GitHub web interface. I can't reproduce your issue; gemma:2b works fine for me. Also try updating Ollama: the latest update announcement (from a couple of minutes ago) said something about gemma support on all platforms.
Gemma requires Ollama 0.1.26, and I had already updated to that before I tried gemma.
Will try sending the attachment via github --
With ellama-providers set to:
(("ollama model" ellama-get-ollama-local-model) ("default model" . ellama-provider) ("ollama model" ellama-get-ollama-local-model) ("default model" . ellama-provider) ("ollama model" ellama-get-ollama-local-model))
and calling ellama-provider-select to pick ollama and the installed gemma:2b,
I get the following backtrace.
Debugger entered--Lisp error: (error "Format specifier doesn’t match argument type")
(format "%s://%s:%d/api/%s" "http" nil nil "generate")
(llm-ollama--url #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") "generate")
(#f(compiled-function (provider prompt) #<bytecode 0x14cdd6b2542495a>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
(apply #f(compiled-function (provider prompt) #<bytecode 0x14cdd6b2542495a>) (#s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil)))
(#f(compiled-function (&rest args) #<bytecode -0x134e5003bf9c4f49>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
(apply #f(compiled-function (&rest args) #<bytecode -0x134e5003bf9c4f49>) #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
(llm-chat #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") #s(llm-chat-prompt :context nil :examples nil :interactions (#s(llm-chat-prompt-interaction :role user :content "I will get you user query, you should return short topic only, what this conversation about. NEVER respond to query itself. Topic must be short and concise.\nFor example:\nQuery: Why is sky blue?\nTopic: Blue sky\n\nQuery: why is the sky blue\nTopic:")) :temperature nil :max-tokens nil))
(ellama-get-name "why is the sky blue")
(ellama-generate-name-by-llm #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") ellama "why is the sky blue")
(ellama-generate-name #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") ellama "why is the sky blue")
(ellama-new-session #s(llm-ollama :scheme "http" :host nil :port nil :chat-model "gemma:2b" :embedding-model "gemma:2b") "why is the sky blue")
(#f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue" nil)
(ad-Advice-ellama-chat #f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue")
(apply ad-Advice-ellama-chat #f(compiled-function (prompt &optional create-session) "Send PROMPT to ellama chat with conversation history.\n\nIf CREATE-SESSION set, creates new session even if there is an active session." (interactive "sAsk ellama: ") #<bytecode 0x879cdb595d165d4>) "why is the sky blue")
(ellama-chat "why is the sky blue")
(#
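The failing frame is right at the top: (format "%s://%s:%d/api/%s" "http" nil nil "generate"). The %d specifier needs an integer, but the llm-ollama provider was constructed with :host and :port nil. A minimal sketch of the failure and of the kind of nil-defaulting that avoids it ("localhost" and 11434 are Ollama's usual defaults, used here as assumptions, not llm's actual fix):

;; Reproduces the error: `%d' requires an integer, not nil.
(format "%s://%s:%d/api/%s" "http" nil nil "generate")
;; => (error "Format specifier doesn't match argument type")

;; Defaulting the nil host/port before formatting avoids the error.
(let ((host (or nil "localhost"))   ; nil stands in for the missing :host
      (port (or nil 11434)))        ; 11434 is Ollama's default port
  (format "%s://%s:%d/api/%s" "http" host port "generate"))
;; => "http://localhost:11434/api/generate"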
@tvraman Try updating your Emacs packages (ellama and llm should both be on their latest versions).
@tvraman which emacs version do you have?
@tvraman Thank you for the report. This should be fixed in 0.8.7. After updating, try calling ellama-provider-select one more time.
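For anyone following along, one way to pull in the fixed versions from inside Emacs, assuming package.el on Emacs 29 or newer (where package-upgrade is built in):

;; Refresh the archives and upgrade both packages.
(package-refresh-contents)
(package-upgrade 'ellama)
(package-upgrade 'llm)
;; Then re-run the provider selector as suggested above.
(call-interactively #'ellama-provider-select)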
Emacs 30 built weekly from Git Head --
thanks! will try later today --
works like a charm after updating ellama --
dbg-log.org