s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0

Switching between Ellama providers generates parameter mismatch errors #73

Closed vick08 closed 4 months ago

vick08 commented 4 months ago

Here is my approximate config for Ellama:

```elisp
(require 'markdown-mode)
(require 'ellama)
(require 'llm-gemini)
(require 'llm-ollama)
(require 'llm-openai)

(use-package ellama
  :init
  (setopt ellama-provider
          (make-llm-ollama :chat-model "zephyr" :embedding-model "zephyr"))
  (setopt ellama-providers
          '(("Gemini" . (make-llm-gemini
                         :key (auth-source-pass-get 'secret "gemini-api")))
            ("OpenAI" . (make-llm-openai
                         :key (auth-source-pass-get 'secret "openai-api"))))))
```

Everything works fine during the initial session, with Zephyr as the default. If I then decide to switch to "OpenAI", for example using `C-c e p s`, I get a response similar to this:

```
Wrong type argument: llm-chat-prompt-interaction, [28705 13 28789 28766 1838 28766 28767 13 16230 28725 ...]
```

Here "OpenAI" is just an example; I would get a different argument-formatting error if I switched to Gemini, for instance. Similarly, if my initial provider was set to "OpenAI" or "Gemini", i.e. my first prompt went to that service, switching to any other provider on the fly would produce similar or related errors.

Is this a limitation of the underlying `llm` library, or am I doing something incorrectly in my setup?

s-kostyaev commented 4 months ago

It's a limitation on my side, not in your setup. If you switch providers, a new session needs to be started; only one provider should be used inside a session.

To work around this error, you can start a new session manually with `C-u M-x ellama-chat` (or `C-u C-c e a i`) and select the other model there. Meanwhile, I will see how to fix it.
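The manual workaround above could be wrapped in a small convenience command. This is only a sketch: the command name `my/ellama-switch-and-chat` is hypothetical, and it assumes that the values in `ellama-providers` are unevaluated constructor forms (as in the config earlier in this thread) and that `ellama-chat` starts a new session when given a prefix argument, as described above.

```elisp
(defun my/ellama-switch-and-chat ()
  "Select a provider from `ellama-providers', then start a fresh chat session.
Starting a new session avoids mixing prompt formats between providers."
  (interactive)
  (let* ((name (completing-read "Provider: " (mapcar #'car ellama-providers)))
         (form (alist-get name ellama-providers nil nil #'string=)))
    ;; The alist values in the config above are unevaluated constructor
    ;; forms, so build the provider object here.
    (setq ellama-provider (eval form))
    ;; Equivalent to C-u M-x ellama-chat: the prefix argument requests
    ;; a new session instead of reusing the current one.
    (let ((current-prefix-arg '(4)))
      (call-interactively #'ellama-chat))))
```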

s-kostyaev commented 4 months ago

Also, what are you doing after switching providers? At what moment do you get the error? On a new `ellama-chat` call?

s-kostyaev commented 4 months ago

Also, is your ellama up to date, or do you have an old version?

s-kostyaev commented 4 months ago

I will try to reproduce and fix it tonight, but I need your answers first.

vick08 commented 4 months ago

Yes, my Ellama package (and all other packages) are up to date. My steps are:

  1. Send a prompt to the default model (Zephyr) using `ellama-chat` or the shortcut key -- it doesn't matter which.
  2. Switch to another provider, say, "OpenAI".
  3. Send the same prompt or a new one using `ellama-chat` or a shortcut key. This is where I get the error.

The errors only happen if I switch Ellama providers. If I stay within the same provider, say Ollama, and only switch models, then everything works fine.

Hope this helps! And thank you for your wonderful package. I love it and use it every day!

s-kostyaev commented 4 months ago

Should work fine on 0.8.2.

vick08 commented 4 months ago

Thanks. This seems to work now!