David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
977 stars 62 forks

Ollama not triggered #83

Closed: gzfrozen closed this issue 3 months ago

gzfrozen commented 3 months ago

Content is only generated successfully once.

After that, ollama no longer seems to be triggered (RAM usage is normal). The window stays empty on every subsequent attempt, without any debug message.


Other tools using ollama work fine (e.g. the Raycast ollama extension).

My settings:

return {
  "David-Kunz/gen.nvim",
  cond = No_vscode,
  event = "VeryLazy",
  config = function()
    require("gen").prompts["Explain_Code"] = {
      prompt = "Explain the following code in $filetype:\n```\n$text\n```",
    }
  end,
  opts = {
    model = "codellama:7b-instruct",
    show_model = true,
    debug = true,
  },
  keys = {
    {
      "<leader>gs",
      "<cmd>Gen<CR>",
      mode = { "n", "x" },
      desc = "[S]tart [G]enerate with llm",
    },
    {
      "<leader>gc",
      "<cmd>Gen Chat<CR>",
      mode = { "n", "x" },
      desc = "[C]ontinue [C]hat with llm",
    },
  },
}
gzfrozen commented 3 months ago

Sorry, my bad, the settings were wrong. With lazy.nvim, defining a `config` function means the `opts` table is not applied automatically: `opts` is only passed as the second argument to `config`, so `require("gen").setup()` was never called with my options. The correct spec is:

return {
  "David-Kunz/gen.nvim",
  cond = No_vscode,
  -- event = "VeryLazy",
  config = function()
    local gen = require("gen")
    gen.setup({
      model = "codellama:7b-instruct",
      show_model = true,
      debug = true,
    })
    gen.prompts["Explain_Code"] = {
      prompt = "Explain the following code in $filetype:\n```\n$text\n```",
    }
  end,
  keys = {
    {
      "<leader>gs",
      "<cmd>Gen<CR>",
      mode = { "n", "x" },
      desc = "[S]tart [G]enerate with llm",
    },
    {
      "<leader>gc",
      "<cmd>Gen Chat<CR>",
      mode = { "n", "x" },
      desc = "[C]ontinue [C]hat with llm",
    },
  },
}
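
As a sketch of an equivalent alternative (assuming lazy.nvim's documented behavior of passing the resolved `opts` table as the second argument to `config`), the options can stay in `opts` while `config` applies them explicitly. The plugin name, options, and prompt are taken from the spec above; only the `config` signature differs:

```lua
return {
  "David-Kunz/gen.nvim",
  cond = No_vscode,
  opts = {
    model = "codellama:7b-instruct",
    show_model = true,
    debug = true,
  },
  config = function(_, opts)
    local gen = require("gen")
    -- When a `config` function exists, lazy.nvim does not call setup()
    -- for you; pass the `opts` it hands us through explicitly.
    gen.setup(opts)
    gen.prompts["Explain_Code"] = {
      prompt = "Explain the following code in $filetype:\n```\n$text\n```",
    }
  end,
  -- keys = { ... },  -- same mappings as in the spec above
}
```

This keeps the options declarative (so other specs could still extend them via `opts`) while ensuring `setup()` actually runs.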