CopilotC-Nvim / CopilotChat.nvim

Chat with GitHub Copilot in Neovim
https://copilotc-nvim.github.io/CopilotChat.nvim/

Copilot Extensions specific configuration support #526

Open · biosugar0 opened 4 hours ago

biosugar0 commented 4 hours ago

Thank you for adding such a convenient feature!
https://github.com/CopilotC-Nvim/CopilotChat.nvim/pull/490

It would be even more useful if we could configure settings for each agent individually. For example, Perplexity AI allows you to use models like those described here:
https://docs.perplexity.ai/guides/model-cards

It might be helpful to have a configuration like the example below:

```lua
local opts = {
  debug = false,
  model = 'claude-3.5-sonnet', -- default model
  agents = { -- agent-specific configurations
    perplexityai = {
      model = 'llama-3.1-sonar-huge-128k-online', -- agent-specific model
    },
  },
  prompts = prompts, -- user-defined prompts table (defined elsewhere)
}

local chat = require('CopilotChat')
chat.setup(opts)
```
biosugar0 commented 4 hours ago

I haven't investigated this in detail yet, but since these agents can be custom-built, it may be better to keep their settings flexible rather than fixed to a known set of keys. While the `model` parameter above is fairly general, individual agents could also expose parameters of their own; see the sketch below.
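To illustrate, here is a minimal sketch of one possible shape for such a flexible configuration. This is not the plugin's current API: it assumes the proposed `agents` table would be forwarded to each agent as-is, and the `search_recency_filter` key is purely hypothetical, included only to show an agent-specific parameter beyond `model`.

```lua
-- Sketch of a flexible per-agent configuration (assumed, not the current API).
-- The idea: the plugin forwards each agent's table unchanged, so agents can
-- receive parameters the plugin itself does not need to know about.
local opts = {
  model = 'claude-3.5-sonnet', -- default model for regular chat
  agents = {
    perplexityai = {
      model = 'llama-3.1-sonar-huge-128k-online',
      -- hypothetical agent-specific option, shown only to illustrate
      -- that arbitrary keys could be passed through untouched
      search_recency_filter = 'week',
    },
  },
}

require('CopilotChat').setup(opts)
```

With a pass-through design like this, adding support for a new agent (or a new agent parameter) would not require changes to the plugin itself.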