s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0
348 stars, 25 forks

Code complete - Only 1st code block is added to current buffer #129

Status: Open. Opened 1 week ago by GitHubGeek.

GitHubGeek commented 1 week ago

Summary: when using ellama-code-complete in an existing buffer, only the first code block from the model's response is inserted into the current buffer. Any non-code text and the second code block are missing.

Ollama terminal output showing two blocks of code (model: phind-codellama:34b-v2):

[screenshot]

Running ellama-code-complete:

[screenshot]

It seems to me Ellama filters out any text outside the first Markdown code block.

It'd be great if the non-code output were inserted as code comments, and the second code block were also included in the buffer.
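
For context, the behavior described above is consistent with extracting only the first fenced block from the response. A minimal sketch of that kind of extraction (hypothetical, not Ellama's actual code):

;; Hypothetical sketch of "keep only the first fenced block" --
;; an illustration of the observed behavior, NOT Ellama's real code.
(defun my/first-code-block (text)
  "Return the body of the first markdown code block in TEXT, or nil."
  (when (string-match "```[^\n]*\n\\(\\(?:.\\|\n\\)*?\\)```" text)
    (match-string 1 text)))

Everything outside that first match (explanatory prose and any later blocks) would be dropped, which matches what the screenshots show.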

s-kostyaev commented 1 week ago

> It'd be great if the non-code output were inserted as code comments, and the second code block were also included in the buffer.

Sounds like a good idea. I will think about how to implement it.

s-kostyaev commented 1 week ago

Until it is implemented, you can use chat or ellama-ask-about as a workaround instead.

GitHubGeek commented 1 week ago

Is there a config option to have the output in plain text instead of markdown?

s-kostyaev commented 1 week ago

> Is there a config option to have the output in plain text instead of markdown?

Try this:

(setopt ellama-major-mode 'fundamental-mode)

GitHubGeek commented 1 week ago

It seems to have no impact. My Doom Emacs config.el section:

(use-package! ellama
  :config
  (setopt ellama-language "English")
  (require 'llm-ollama)
  (setopt ellama-major-mode 'fundamental-mode)
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "phind-codellama:34b-v2"
           :embedding-model "phind-codellama:34b-v2")))

s-kostyaev commented 1 week ago

> It seems to have no impact. My Doom Emacs config.el section:
>
> (use-package! ellama
>   :config
>   (setopt ellama-language "English")
>   (require 'llm-ollama)
>   (setopt ellama-major-mode 'fundamental-mode)
>   (setopt ellama-provider
>           (make-llm-ollama
>            :chat-model "phind-codellama:34b-v2"
>            :embedding-model "phind-codellama:34b-v2")))

Please open another issue for this. I will look into it later.

s-kostyaev commented 5 days ago

@GitHubGeek try an improved template:

(setopt ellama-code-complete-prompt-template "Continue the following code, only write new code in format ```language\n...\n```:\n```\n%s\n```\nWrite all the code in single code block.")
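
Note that the template is an ordinary format string: the %s placeholder receives the code being completed. Roughly (that Ellama calls format exactly this way is an assumption):

;; Sketch: the %s in the template above receives the buffer's code.
;; `format' is standard Emacs Lisp; the exact call site inside
;; Ellama is an assumption here.
(format ellama-code-complete-prompt-template
        "(defun greet ()\n  ...)")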

After that try your original prompt.