Closed — vick08 closed this issue 6 months ago.
There are two options: ellama-provider is used by default, and ellama-providers can contain multiple providers for fast switching between them at runtime. Can you show your configuration (redacted, without real keys)?
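For illustration, a minimal sketch of the two options. The model names and the "my_key" placeholder are assumptions, not values confirmed in this thread; the quoted-alist style follows the configuration shown later in this discussion.

```elisp
(require 'llm-ollama)
(require 'llm-openai)

;; Option 1: a single default provider, used for all requests.
(setopt ellama-provider
        (make-llm-ollama :chat-model "zephyr" :embedding-model "zephyr"))

;; Option 2: a named alist of providers; switch between them at
;; runtime with M-x ellama-provider-select.
(setopt ellama-providers
        '(("zephyr" . (make-llm-ollama :chat-model "zephyr"))
          ("openai" . (make-llm-openai :key "my_key"))))
```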
Yes, I think that was my mistake. I was putting entries into the "ellama-provider" variable, thinking that "ellama-providers" was optional since Ollama already returned the list of installed models.
Anyway, after moving my entries to "ellama-providers", the providers seem to work only the first time. When I try switching between providers a second time, I start getting errors about missing parameters (not sure if the llm package gets confused). Here is a copy of the region in question:

```elisp
(require 'ellama)
(require 'llm-gemini)
(require 'llm-ollama)
(require 'llm-openai)

;;; Ellama:
(use-package ellama
  :init
  (setopt ellama-provider
          (make-llm-ollama :chat-model "zephyr" :embedding-model "zephyr"))
  (setopt ellama-providers
          '(("Gemini" . (make-llm-gemini :key "my_key"))
            ("OpenAI" . (make-llm-openai :key "my_key")))))
```
But this has become a different issue now. Do let me know if you'd like me to open a new one! Thank you!
We can continue here. Which missing parameters do you see in the errors?
Closing this, as the original issue is solved. Feel free to reopen or open a new one.
Unless I am doing something wrong, I currently do not see a way to use more than a single Ellama provider. Ellama seems to default to Ollama, but the underlying llm package offers the ability to use other services such as Gemini, OpenAI, etc. I did notice that if I place multiple "ellama-provider" settings next to each other, along with the appropriate "require"s, only the last one is used. Even in that instance, calling "ellama-provider-select" attempts to default to Ollama or otherwise an empty string. Hope I described that well. Thank you!
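To illustrate the "only the last one is used" behavior described above: successive assignments to the same variable simply overwrite one another, so only the final form takes effect. A sketch (the providers and key shown are placeholders):

```elisp
(require 'llm-ollama)
(require 'llm-openai)

;; Each setopt replaces the previous value of ellama-provider,
;; so only the OpenAI provider below ends up active:
(setopt ellama-provider (make-llm-ollama :chat-model "zephyr"))
(setopt ellama-provider (make-llm-openai :key "my_key"))
```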