David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts
The Unlicense
992 stars 64 forks

Added option to specify custom llama model with prompt #7

Closed JoseConseco closed 9 months ago

JoseConseco commented 9 months ago

This way, coding prompts could use e.g. codellama, grammar and summary prompts could use mistral, and translation could use some other model. Example:

    prompts['Change_Code'] = {
        prompt = "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n$input\n[CODE]\n$text\n[/CODE]\n" .. "\n\n### Response:",
        replace = false,
        extract = "```$filetype\n(.-)```",
        model = "mywizard_coder:latest",
    }
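To make the motivation concrete, a configuration mixing models across prompts might look like this (a sketch only; the prompt names, prompt texts, and model tags are illustrative, not part of this PR):

```lua
-- Sketch: each prompt picks the model best suited to its task.
-- Prompt names and model tags below are illustrative.
prompts['Change_Code'] = {
    prompt = "### Instruction:\n$input\n[CODE]\n$text\n[/CODE]\n\n### Response:",
    replace = false,
    extract = "```$filetype\n(.-)```",
    model = "codellama:latest",  -- code-oriented model
}
prompts['Fix_Grammar'] = {
    prompt = "Fix the grammar in the following text:\n$text",
    replace = true,
    model = "mistral:latest",    -- general-purpose model
}
```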
David-Kunz commented 9 months ago

Hi @JoseConseco ,

Thank you for your PR!

This should also work without your change, no?

local opts = vim.tbl_deep_extend('force', {
        model = M.model,
        command = M.command
    }, { model = 'foobar' })
print(opts.model) -- 'foobar'
JoseConseco commented 9 months ago

You are right. We can set prompt.model = 'xxx', which indeed overrides the M.model field. There is no need for this PR at all! My bad
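For reference, the per-prompt override that makes this PR unnecessary can be written as follows (a sketch, assuming the standard require('gen') entry point; the model tag is illustrative):

```lua
local gen = require('gen')
-- Setting model on an individual prompt already overrides the global
-- M.model, because gen.nvim merges the prompt's options over the
-- defaults with vim.tbl_deep_extend('force', ...).
gen.prompts['Change_Code'] = {
    prompt = "### Instruction:\n$input\n[CODE]\n$text\n[/CODE]\n\n### Response:",
    replace = false,
    extract = "```$filetype\n(.-)```",
    model = "mywizard_coder:latest", -- per-prompt override
}
```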