David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts

I got an error when I ran Gen #89

Closed: a1401358759 closed this issue 3 months ago

a1401358759 commented 3 months ago

This is my config:

  {
    "David-Kunz/gen.nvim",
    event = "VeryLazy",
    config = function()
      local gen = require("gen")
      gen.setup({
        model = "llama2",
        display_mode = "float", -- The display mode. Can be "float" or "split".
        show_prompt = true, -- Shows the prompt submitted to Ollama.
        show_model = true, -- Displays which model you are using at the beginning of your chat session.
        no_auto_close = false, -- Never closes the window automatically.
        debug = false, -- Prints errors and the command which is run.
      })
      gen.prompts["Explain_Code"] = {
        prompt = "Explain the following code in $filetype:\n```\n$text\n```",
      }
      gen.prompts["Fix_Code"] = {
        prompt = "Fix the following code. Only ouput the result in format ```$filetype\n...\n```:\n```$filetype\n$text\n```",
        replace = true,
        extract = "```$filetype\n(.-)```",
      }
    end,
    keys = {
      {
        "<leader>sm",
        function()
          require("gen").select_model()
        end,
        desc = "Select Ollama Model",
      },
    },
  }

But I got this error (see the attached screenshot).

life00 commented 3 months ago

I also wanted to report this issue. It appears to be caused when show_prompt = true (by default it is false). It also seems that this bug was introduced recently, because I don't remember running into it a few weeks ago. As a result of the bug, user prompts are also not shown.

A more appropriate issue title would be: gen.nvim errors when show_prompt=true
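
In the meantime, a possible workaround (a minimal sketch, assuming show_prompt = true really is the trigger) is to leave that option at its default value and keep the rest of the setup unchanged:

  -- Workaround sketch (assumption: the error only occurs with show_prompt = true).
  -- Same options as the config above, with show_prompt left at its default.
  require("gen").setup({
    model = "llama2",
    display_mode = "float",
    show_prompt = false, -- default value; setting it to true triggers the reported error
    show_model = true,
    no_auto_close = false,
    debug = false,
  })

The prompt submitted to Ollama is then simply not echoed in the chat window, but generation itself keeps working until the fix is merged.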

life00 commented 3 months ago

PR https://github.com/David-Kunz/gen.nvim/pull/90 appears to fix the issue. Please merge.

David-Kunz commented 3 months ago

Thanks for reporting the issue and fixing it! ❤️