s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.

SOLVED Integration without the installation script was failing #21

Closed jcintasr closed 9 months ago

jcintasr commented 10 months ago

I installed ollama through pacman and pulled the zephyr model with ollama run zephyr, but the default Emacs configuration wasn't working for me. I solved it by changing it to:

(use-package ellama
  :init
  (setopt ellama-language "English")
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "zephyr" :embedding-model "zephyr")))

Now it seems to be working, but I have to run "ollama serve" in a terminal first (this is intended).

I hope this helps to solve similar issues.

EDIT: I uncommented the (require 'llm-ollama) line. I tested this in another setup and it was necessary.
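
In case it helps anyone debugging a similar setup, a quick way to sanity-check the server side (a rough sketch, assuming ollama is listening on its default port 11434) is to confirm the daemon answers and that the model is actually pulled:

# Check that the ollama server answers on its default port (11434)
curl http://localhost:11434/api/tags
# List the locally pulled models; "zephyr" should appear here
ollama list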

s-kostyaev commented 10 months ago

There is an ollama systemd service on Arch Linux. You can enable and start it, and then it should work without any additional commands in a terminal.
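
Something like the following should be enough (assuming the Arch package ships the unit under the name ollama.service):

# Enable the service so it starts on boot, and start it right away
sudo systemctl enable --now ollama
# Check that the daemon is up
systemctl status ollama

After that the manual "ollama serve" step should no longer be needed.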

tusharhero commented 10 months ago

Isn't this how it's supposed to be?

jcintasr commented 10 months ago

Isn't this how it's supposed to be?

No, I commented out the llm line and modified the chat model.

Sorry if this was supposed to be obvious; I don't have a lot of experience tweaking Emacs, let alone LLMs.

The point of this issue is that I got it running by making those changes, in case someone like me runs into the same difficulties.

s-kostyaev commented 9 months ago

Closing this now. If someone else runs into the same problem, we will see how to improve the documentation.