Closed · vick08 closed this 4 months ago
That's my bug. If you switch providers, a new session needs to be started. Only one provider should be used within a session.
To prevent this error, you can start a new session manually with C-u M-x ellama-chat
or C-u C-c e a i
and select another model. Meanwhile, I will see how to fix it.
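If you do this often, the manual workaround can be wrapped in a small command. This is a minimal sketch; the function name is mine (hypothetical), and it assumes only what is described above, namely that `ellama-chat` starts a new session and lets you pick a model when called with a prefix argument:

```elisp
;; Sketch: start a fresh ellama session before switching providers.
;; Assumes `ellama-chat' creates a new session (and prompts for a
;; model) when invoked with a prefix argument, per the workaround
;; above. The command name `my/ellama-new-session' is hypothetical.
(defun my/ellama-new-session ()
  "Start a new ellama session, as with C-u M-x ellama-chat."
  (interactive)
  (let ((current-prefix-arg '(4)))   ; simulate C-u
    (call-interactively #'ellama-chat)))
```

You could bind this to a convenient key so that switching providers always begins from a clean session.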
Also, what are you doing after switching providers? At what moment do you get the error? On a new ellama-chat call?
Also, is your ellama up to date, or do you have an old version?
I will try to reproduce and fix it tonight, but I need your answers.
Yes, my Ellama package (and all my other packages) are up to date. My steps include:
Should work fine on 0.8.2.
Thanks. This seems to work now!
Here is my approximate config for Ellama:

```elisp
(require 'markdown-mode)
(require 'ellama)
(require 'llm-gemini)
(require 'llm-ollama)
(require 'llm-openai)

(use-package ellama
  :init
  (setopt ellama-provider
          (make-llm-ollama :chat-model "zephyr" :embedding-model "zephyr"))
  (setopt ellama-providers
          '(("Gemini" . (make-llm-gemini
                         :key (auth-source-pass-get 'secret "gemini-api")))
            ("OpenAI" . (make-llm-openai
                         :key (auth-source-pass-get 'secret "openai-api"))))))
```
Everything works fine during the initial session (Zephyr by default). If I decide to switch to "OpenAI", for example using C-c e p s, I get a response similar to this:

```
Wrong type argument: llm-chat-prompt-interaction, [28705 13 28789 28766 1838 28766 28767 13 16230 28725 ...]
```
Here "OpenAI" is just an example; I would get a different argument-formatting error if I switched to Gemini, for instance. Similarly, if my initial provider was set to "OpenAI" or "Gemini", i.e. my first prompt went to that service, switching to any other provider on the fly would produce similar or related errors.
Is this a limitation of the underlying llm library, or am I doing something incorrectly in my setup?