David-Kunz / gen.nvim

Neovim plugin to generate text using LLMs with customizable prompts

Customize scheme and path #116

Open rdmcguire opened 1 month ago

rdmcguire commented 1 month ago

This is a great plugin! I have Ollama running behind an Istio gateway in Kubernetes, and I'd like to be able to set the URI to something like https://myhost.mydomain/ai rather than use host and port parameters that have no support for TLS or for serving the API out of a sub-path. I'm not awesome at Lua, but I could scrape together a PR if desired.
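For illustration, roughly the kind of setup I'm after (the url option below is hypothetical; only host and port exist today):

require('gen').setup({
  model = "mistral", -- any model served by Ollama
  -- what the plugin accepts today: plain http against host:port, no sub-path
  host = "localhost",
  port = "11434",
  -- hypothetical option this issue is asking for:
  -- url = "https://myhost.mydomain/ai",
})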

FlippingBinary commented 4 weeks ago

The plugin uses http in both the default command and list_models functions in the setup options, so I set a custom command and list_models that are copies of the defaults with http changed to https to get TLS support. This works for me:

return {
  "David-Kunz/gen.nvim",
  opts = {
    command = function(options)
      -- copy of the default command, with http changed to https
      local body = { model = options.model, stream = true }
      return "curl --silent --no-buffer -X POST https://" .. options.host .. ":" .. options.port .. "/api/chat -d $body"
    end,
    list_models = function(options)
      -- copy of the default list_models, with http changed to https
      local response = vim.fn.systemlist(
        "curl --silent --no-buffer https://" .. options.host .. ":" .. options.port .. "/api/tags")
      local list = vim.fn.json_decode(response)
      local models = {}
      for key, _ in pairs(list.models) do
        table.insert(models, list.models[key].name)
      end
      table.sort(models)
      return models
    end,
    -- ... other options
  }
}

In your case, you'd probably also want to change /api/chat to /ai/api/chat and /api/tags to /ai/api/tags, or whatever works for your gateway.
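Untested, but here is a sketch of what that might look like against your gateway, assuming https://myhost.mydomain/ai is the base URL and that the plugin keeps substituting $body the way it does in the default command:

return {
  "David-Kunz/gen.nvim",
  opts = {
    command = function(options)
      -- base URL of the Istio gateway; adjust to whatever works
      local base = "https://myhost.mydomain/ai"
      -- $body is a placeholder the plugin fills with the JSON request body
      return "curl --silent --no-buffer -X POST " .. base .. "/api/chat -d $body"
    end,
    list_models = function(options)
      local base = "https://myhost.mydomain/ai"
      local response = vim.fn.systemlist("curl --silent --no-buffer " .. base .. "/api/tags")
      local list = vim.fn.json_decode(response)
      local models = {}
      for _, m in ipairs(list.models) do
        table.insert(models, m.name)
      end
      table.sort(models)
      return models
    end,
    -- ... other options
  }
}

With that, host and port are simply ignored, since the full URL is built inside the two functions.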

David-Kunz commented 3 days ago

Thank you @FlippingBinary and @rdmcguire, I hope the suggestion helps in your case!